Nov 29 01:10:55 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 29 01:10:55 crc restorecon[4745]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:55 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 29 01:10:56 crc restorecon[4745]: 
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 
01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 29 01:10:56 crc 
restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 
01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 29 01:10:56 crc restorecon[4745]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 29 01:10:56 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.956471 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961804 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961839 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961846 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961852 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961858 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961866 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961872 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961878 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961883 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961888 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961894 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961901 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961907 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961912 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961917 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961922 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961927 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961933 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961940 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961946 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961951 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961957 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961962 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961968 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961974 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961979 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961985 4749 feature_gate.go:330] unrecognized feature gate: Example Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961990 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.961996 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962001 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962006 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962011 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962017 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962022 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962027 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962032 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962037 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962042 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962048 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962053 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962058 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962063 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962070 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962076 4749 
feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962082 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962087 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962091 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962096 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962102 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962108 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962113 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962118 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962122 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962129 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962134 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962140 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962145 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962150 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962155 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962160 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962165 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962170 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962175 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962181 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962188 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962226 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
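After a last few gate warnings below, the kubelet dumps every CLI flag with its effective value (the flags.go:64 run). A sketch that folds that run into a dict, handy for diffing the effective flags against the config file; the FLAG line format is taken from the entries below, where values are always double-quoted:

```python
# Hedged sketch: parse the 'flags.go:64] FLAG: --name="value"' dump that
# starts below into a dict of flag name -> logged value string.
import re

FLAG_RE = re.compile(r'FLAG: (--[\w-]+)="(.*)"')

def parse_flag_dump(lines):
    flags = {}
    for line in lines:
        m = FLAG_RE.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

sample = [
    'I1129 01:10:56.962392 4749 flags.go:64] FLAG: --address="0.0.0.0"',
    'I1129 01:10:56.962979 4749 flags.go:64] FLAG: --max-pods="110"',
]
print(parse_flag_dump(sample))  # {'--address': '0.0.0.0', '--max-pods': '110'}
```

Note in the dump itself that --config points at /etc/kubernetes/kubelet.conf and --container-runtime-endpoint at the CRI-O socket, matching the deprecation remedies logged at startup.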
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962234 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962240 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962245 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962250 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.962256 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962375 4749 flags.go:64] FLAG: --address="0.0.0.0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962392 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962405 4749 flags.go:64] FLAG: --anonymous-auth="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962415 4749 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962424 4749 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962431 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962440 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962447 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962454 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962459 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962466 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962472 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962478 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962484 4749 flags.go:64] FLAG: --cgroup-root="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962490 4749 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962496 4749 flags.go:64] FLAG: --client-ca-file="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962502 4749 flags.go:64] FLAG: --cloud-config="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962508 4749 flags.go:64] FLAG: --cloud-provider="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962513 4749 flags.go:64] FLAG: --cluster-dns="[]" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962522 4749 flags.go:64] FLAG: --cluster-domain="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962528 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962534 4749 flags.go:64] FLAG: --config-dir="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962539 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962545 4749 flags.go:64] FLAG: --container-log-max-files="5" Nov 29 
01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962561 4749 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962567 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962573 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962579 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962585 4749 flags.go:64] FLAG: --contention-profiling="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962591 4749 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962597 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962603 4749 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962608 4749 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962616 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962622 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962628 4749 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962634 4749 flags.go:64] FLAG: --enable-load-reader="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962640 4749 flags.go:64] FLAG: --enable-server="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962645 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962652 4749 flags.go:64] FLAG: --event-burst="100" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962658 4749 flags.go:64] FLAG: --event-qps="50" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962665 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962671 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962676 4749 flags.go:64] FLAG: --eviction-hard="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962684 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962689 4749 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962696 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962703 4749 flags.go:64] FLAG: --eviction-soft="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962708 4749 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962714 4749 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962720 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962725 4749 flags.go:64] FLAG: --experimental-mounter-path="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962731 4749 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962736 4749 flags.go:64] FLAG: 
--fail-swap-on="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962742 4749 flags.go:64] FLAG: --feature-gates="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962749 4749 flags.go:64] FLAG: --file-check-frequency="20s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962755 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962761 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962767 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962774 4749 flags.go:64] FLAG: --healthz-port="10248" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962781 4749 flags.go:64] FLAG: --help="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962787 4749 flags.go:64] FLAG: --hostname-override="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962794 4749 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962801 4749 flags.go:64] FLAG: --http-check-frequency="20s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962807 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962812 4749 flags.go:64] FLAG: --image-credential-provider-config="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962817 4749 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962824 4749 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962829 4749 flags.go:64] FLAG: --image-service-endpoint="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962835 4749 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962841 4749 flags.go:64] FLAG: --kube-api-burst="100" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962847 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962853 4749 flags.go:64] FLAG: --kube-api-qps="50" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962859 4749 flags.go:64] FLAG: --kube-reserved="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962865 4749 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962870 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962877 4749 flags.go:64] FLAG: --kubelet-cgroups="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962883 4749 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962889 4749 flags.go:64] FLAG: --lock-file="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962895 4749 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962901 4749 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962907 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962916 4749 flags.go:64] FLAG: --log-json-split-stream="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962922 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 29 01:10:56 crc 
kubenswrapper[4749]: I1129 01:10:56.962928 4749 flags.go:64] FLAG: --log-text-split-stream="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962934 4749 flags.go:64] FLAG: --logging-format="text" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962939 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962946 4749 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962952 4749 flags.go:64] FLAG: --manifest-url="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962958 4749 flags.go:64] FLAG: --manifest-url-header="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962966 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962972 4749 flags.go:64] FLAG: --max-open-files="1000000" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962979 4749 flags.go:64] FLAG: --max-pods="110" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962985 4749 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962991 4749 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.962997 4749 flags.go:64] FLAG: --memory-manager-policy="None" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963003 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963008 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963015 4749 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963020 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963035 4749 flags.go:64] FLAG: --node-status-max-images="50" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963041 4749 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963047 4749 flags.go:64] FLAG: --oom-score-adj="-999" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963052 4749 flags.go:64] FLAG: --pod-cidr="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963058 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963068 4749 flags.go:64] FLAG: --pod-manifest-path="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963073 4749 flags.go:64] FLAG: --pod-max-pids="-1" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963079 4749 flags.go:64] FLAG: --pods-per-core="0" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963085 4749 flags.go:64] FLAG: --port="10250" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963091 4749 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963097 4749 flags.go:64] FLAG: --provider-id="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963103 4749 flags.go:64] FLAG: --qos-reserved="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963109 4749 flags.go:64] FLAG: --read-only-port="10255" Nov 29 01:10:56 crc 
kubenswrapper[4749]: I1129 01:10:56.963116 4749 flags.go:64] FLAG: --register-node="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963121 4749 flags.go:64] FLAG: --register-schedulable="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963128 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963137 4749 flags.go:64] FLAG: --registry-burst="10" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963143 4749 flags.go:64] FLAG: --registry-qps="5" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963149 4749 flags.go:64] FLAG: --reserved-cpus="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963154 4749 flags.go:64] FLAG: --reserved-memory="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963162 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963168 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963175 4749 flags.go:64] FLAG: --rotate-certificates="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963180 4749 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963186 4749 flags.go:64] FLAG: --runonce="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963211 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963217 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963223 4749 flags.go:64] FLAG: --seccomp-default="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963229 4749 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963236 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963242 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963247 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963253 4749 flags.go:64] FLAG: --storage-driver-password="root" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963259 4749 flags.go:64] FLAG: --storage-driver-secure="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963271 4749 flags.go:64] FLAG: --storage-driver-table="stats" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963278 4749 flags.go:64] FLAG: --storage-driver-user="root" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963285 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963292 4749 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963300 4749 flags.go:64] FLAG: --system-cgroups="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963307 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963320 4749 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963327 4749 flags.go:64] FLAG: --tls-cert-file="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963335 4749 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 29 01:10:56 
crc kubenswrapper[4749]: I1129 01:10:56.963344 4749 flags.go:64] FLAG: --tls-min-version="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963349 4749 flags.go:64] FLAG: --tls-private-key-file="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963355 4749 flags.go:64] FLAG: --topology-manager-policy="none" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963361 4749 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963367 4749 flags.go:64] FLAG: --topology-manager-scope="container" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963374 4749 flags.go:64] FLAG: --v="2" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963382 4749 flags.go:64] FLAG: --version="false" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963390 4749 flags.go:64] FLAG: --vmodule="" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963397 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.963403 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963562 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963573 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963581 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963588 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963595 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963602 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963609 4749 feature_gate.go:330] unrecognized feature gate: Example Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963617 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963624 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963631 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963637 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963644 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963650 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963659 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963773 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963781 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963787 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 01:10:56 crc kubenswrapper[4749]: 
W1129 01:10:56.963794 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963801 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963808 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963813 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963825 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963832 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963837 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963844 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963850 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963856 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963861 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963867 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963872 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963877 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963883 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963889 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963894 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963899 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963904 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963909 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963914 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963919 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963924 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963929 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963935 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 
01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963940 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963945 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963950 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963957 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963962 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963967 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963972 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963977 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963981 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963987 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.963991 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964001 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964013 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964023 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964030 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964037 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964042 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964047 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964053 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964058 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964062 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964068 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964072 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964077 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964082 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964089 4749 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964094 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964099 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.964104 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.964121 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.971623 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.971656 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971741 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971750 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971759 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971767 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971775 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971782 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971788 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971795 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971802 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971809 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971816 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971822 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971826 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971831 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971837 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971841 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971846 4749 feature_gate.go:330] unrecognized 
feature gate: IngressControllerDynamicConfigurationManager Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971852 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971860 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971868 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971876 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971883 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971890 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971896 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971902 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971908 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971915 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971921 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971928 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971936 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971942 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971947 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971952 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971957 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971963 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971968 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971973 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971978 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971983 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971987 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971992 4749 feature_gate.go:330] unrecognized feature gate: Example Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.971998 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972008 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972018 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972024 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972031 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972038 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972044 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972050 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972056 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972063 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972071 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972079 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972085 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972091 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972098 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972103 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972108 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972113 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972118 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972123 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972128 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972132 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972137 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972142 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972146 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972151 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972156 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972161 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972165 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972172 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.972182 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972441 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972459 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
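Each parsing pass ends with the effective gate set logged at feature_gate.go:386 as a Go map literal, as seen above; the resolved map is identical across passes. A sketch that turns that literal into a Python dict of booleans, assuming the exact {map[Name:bool ...]} rendering shown in the log:

```python
# Hedged sketch: convert the Go map literal logged at feature_gate.go:386,
# e.g. "{map[CloudDualStackNodeIPs:true NodeSwap:false]}", into a dict.
import re

def parse_gate_map(logged: str) -> dict[str, bool]:
    body = re.search(r"\{map\[(.*)\]\}", logged).group(1)
    gates = {}
    for pair in body.split():          # pairs are space-separated
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

sample = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}"
print(parse_gate_map(sample))
# {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'NodeSwap': False}
```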
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972469 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972478 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972486 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972492 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972499 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972505 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972512 4749 feature_gate.go:330] unrecognized feature gate: Example Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972521 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972528 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972534 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972541 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972547 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972553 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972559 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972565 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972572 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972580 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972587 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972592 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972598 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972602 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972607 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972612 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972617 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972622 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972627 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972632 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972636 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972641 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972646 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972651 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972658 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972664 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972670 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972675 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972681 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972686 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972692 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972697 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972702 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972707 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972713 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972718 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972724 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972729 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972734 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972739 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972743 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972748 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972753 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972759 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972763 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972769 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972775 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972780 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972786 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972792 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972797 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 29 01:10:56 crc 
kubenswrapper[4749]: W1129 01:10:56.972802 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972807 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972811 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972816 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972821 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972827 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972832 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972837 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972842 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972848 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 29 01:10:56 crc kubenswrapper[4749]: W1129 01:10:56.972854 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.972861 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.973298 4749 server.go:940] "Client rotation is on, will bootstrap in background" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.976384 4749 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.976480 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
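With flags and gates settled, the kubelet turns to client-certificate bootstrap: rotation is on, the existing kubeconfig is still valid, and the current pair is loaded from kubelet-client-current.pem. The entries that follow pick a rotation deadline well before the logged 2026-02-24 expiry and wait (~803h) until then. A sketch of that arithmetic, under the assumption (modeled on upstream client-go behavior, not stated in this log) that the deadline is jittered into roughly the last 10-20% of the certificate's validity; the one-year validity below is illustrative, not read from the log:

```python
# Hedged sketch of the "rotation deadline ... Waiting ..." computation.
# Assumption: deadline = notBefore + (80% + up to 10%) of total validity.
import random
from datetime import datetime, timedelta, timezone

def rotation_deadline(not_before: datetime, not_after: datetime) -> datetime:
    total = not_after - not_before
    offset = total * (0.8 + 0.1 * random.random())  # assumed jitter band
    return not_before + offset

not_after = datetime(2026, 2, 24, 5, 52, 8, tzinfo=timezone.utc)   # from the log
not_before = not_after - timedelta(days=365)                       # assumed validity
deadline = rotation_deadline(not_before, not_after)
now = datetime(2025, 11, 29, 1, 10, 56, tzinfo=timezone.utc)       # boot time above
print(f"rotation deadline {deadline}, waiting {deadline - now}")
```

Plugging the logged expiry into this band lands the deadline around late December to mid-January, consistent with the 2026-01-01 deadline the log reports next.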
Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.977031 4749 server.go:997] "Starting client certificate rotation" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.977057 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.977415 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-01 13:01:02.810145527 +0000 UTC Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.977493 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 803h50m5.832655126s for next certificate rotation Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.983002 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.984837 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 01:10:56 crc kubenswrapper[4749]: I1129 01:10:56.992500 4749 log.go:25] "Validated CRI v1 runtime API" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.005259 4749 log.go:25] "Validated CRI v1 image API" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.006744 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.009176 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-29-01-06-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.009221 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.024157 4749 manager.go:217] Machine: {Timestamp:2025-11-29 01:10:57.023018176 +0000 UTC m=+0.195168053 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ffaab6e0-7081-491c-af88-2b486225a952 BootID:0abfdf85-5794-49a6-a3fb-09e2a103db44 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:aa:b1:d3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:aa:b1:d3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c6:d6:79 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:bf:98:96 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b5:c1:39 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ab:8f:00 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ae:23:47 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1e:77:f8:f2:d6:13 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:28:6a:64:b3:81 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.024362 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.024458 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.024855 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025015 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025042 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025297 4749 topology_manager.go:138] "Creating topology manager with none policy" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025306 4749 container_manager_linux.go:303] "Creating device plugin manager" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025492 4749 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025516 4749 server.go:66] "Creating device plugin registration server" 
version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025649 4749 state_mem.go:36] "Initialized new in-memory state store" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.025724 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.026294 4749 kubelet.go:418] "Attempting to sync node with API server" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.026312 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.026334 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.026346 4749 kubelet.go:324] "Adding apiserver pod source" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.026356 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.027788 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.028145 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.028627 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.028633 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.028734 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.028752 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.028743 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029447 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029469 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029477 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029484 4749 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029494 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029501 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029514 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029527 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029537 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029546 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029565 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029575 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.029925 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.030880 4749 server.go:1280] "Started kubelet" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.031327 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.031755 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Nov 29 01:10:57 crc systemd[1]: Started Kubernetes Kubelet. 
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.032891 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.033727 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.033637 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187c550a5d99f523 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 01:10:57.030796579 +0000 UTC m=+0.202946456,LastTimestamp:2025-11-29 01:10:57.030796579 +0000 UTC m=+0.202946456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036473 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036516 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036537 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 02:05:52.707287132 +0000 UTC Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036486 4749 server.go:460] "Adding debug handlers to kubelet server" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036606 4749 volume_manager.go:287] "The desired_state_of_world populator starts" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036628 4749 volume_manager.go:289] "Starting Kubelet Volume Manager" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.036681 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.036743 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.037350 4749 factory.go:55] Registering systemd factory Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.037415 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.037441 4749 factory.go:221] Registration of the systemd container factory successfully Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.037364 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.037557 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.037924 4749 factory.go:153] Registering CRI-O factory Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.037946 4749 factory.go:221] Registration of the crio container factory successfully Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.038010 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.038031 4749 factory.go:103] Registering Raw factory Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.038049 4749 manager.go:1196] Started watching for new ooms in manager Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.038801 4749 manager.go:319] Starting recovery of all containers Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055230 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055295 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055323 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055344 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055382 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055393 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055404 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055412 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055425 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055435 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055444 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055454 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055465 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055495 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055521 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055531 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055541 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055552 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055562 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055574 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055583 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055592 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055610 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055620 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055631 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055646 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055657 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055669 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055680 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055690 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055700 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055713 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055741 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055752 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055762 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055774 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055785 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055824 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055837 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055852 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055869 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055880 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055891 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055903 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055917 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055928 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055939 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055952 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055981 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.055995 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056012 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056042 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056079 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056093 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056105 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056119 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056132 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056144 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056156 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056166 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056177 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056186 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056216 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056227 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056238 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056250 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056260 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056270 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056279 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056288 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056298 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056308 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056318 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056328 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056339 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056350 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056360 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056369 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056382 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056394 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056404 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056414 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056426 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056437 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056446 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056458 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056468 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056480 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056492 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056504 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056515 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056525 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056535 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056545 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056555 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056566 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056576 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.056587 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057140 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057170 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057182 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057223 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057236 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057250 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057272 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057288 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057301 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057313 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057327 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057339 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057353 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057366 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057380 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057393 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057406 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057417 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057429 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057443 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057452 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057464 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057473 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057485 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057497 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057521 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057533 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057544 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057553 4749 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057562 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057571 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057580 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057590 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057602 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057612 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057637 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057646 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057655 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057665 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057675 4749 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057686 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057697 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057711 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057726 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057738 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057750 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057760 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057782 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057794 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057807 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057822 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057835 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057845 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057859 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057885 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057899 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057911 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057923 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057935 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057947 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057956 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057965 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057973 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057982 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.057991 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058000 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058011 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058022 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058031 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058043 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058052 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058061 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058069 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058084 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058092 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058102 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058111 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058121 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058130 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058139 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058147 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058155 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058165 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058175 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058184 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058205 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058214 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058224 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058233 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058247 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058257 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058267 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058277 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058286 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058296 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058305 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058313 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058322 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058335 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058344 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058353 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058365 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058373 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058383 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058394 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058405 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058414 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058425 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058434 4749 reconstruct.go:97] "Volume reconstruction finished" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.058441 4749 reconciler.go:26] "Reconciler: start to sync state" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.063994 4749 manager.go:324] Recovery completed Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.072133 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.073663 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.073703 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.073728 4749 kubelet.go:2335] "Starting kubelet main sync loop" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.073780 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.074742 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.074911 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.075895 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.077410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.077459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.077473 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.078118 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.078139 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 29 01:10:57 crc 
kubenswrapper[4749]: I1129 01:10:57.078165 4749 state_mem.go:36] "Initialized new in-memory state store" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.086531 4749 policy_none.go:49] "None policy: Start" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.088022 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.088057 4749 state_mem.go:35] "Initializing new in-memory state store" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.137274 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.140356 4749 manager.go:334] "Starting Device Plugin manager" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.140411 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.140422 4749 server.go:79] "Starting device plugin registration server" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.141149 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.141166 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.141405 4749 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.141513 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.141521 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.147540 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.174801 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.174888 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.175885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.175946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.175959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.176257 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.176414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.176456 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.177663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.177689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.177700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.177843 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178341 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.178882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179119 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179276 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179307 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.179634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.182900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.183991 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.184140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.184214 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185145 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185169 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.185966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.238237 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.241313 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.242763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.242799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.242808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.242833 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.243211 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261594 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.261983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:10:57 
crc kubenswrapper[4749]: I1129 01:10:57.363675 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.363731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.364163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.444181 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.445833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.445889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.445898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.445921 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.446447 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.500433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
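The entries above show the kubelet trying to POST its Node object while the API server is still unreachable ("Attempting to register node" followed by "Unable to register node with API server"). As an illustration only, not the kubelet's own registration code, here is a minimal client-go sketch of the same register-and-retry loop; the kubeconfig path is an assumption:

```go
// Hypothetical sketch: create the Node and retry while the API server is down.
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed path; the real kubelet builds its client from its own config.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	node := &corev1.Node{ObjectMeta: metav1.ObjectMeta{Name: "crc"}}
	for {
		// This is the Post https://.../api/v1/nodes seen failing in the log.
		_, err := client.CoreV1().Nodes().Create(context.TODO(), node, metav1.CreateOptions{})
		if err == nil {
			fmt.Println("registered node crc")
			return
		}
		fmt.Printf("unable to register node: %v; retrying\n", err)
		time.Sleep(5 * time.Second)
	}
}
```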
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.516283 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.519877 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e60af1a6f69e71b8793908be136367b495a5e9e03473ef417ebbff07a67fac1e WatchSource:0}: Error finding container e60af1a6f69e71b8793908be136367b495a5e9e03473ef417ebbff07a67fac1e: Status 404 returned error can't find the container with id e60af1a6f69e71b8793908be136367b495a5e9e03473ef417ebbff07a67fac1e
Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.530934 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-52774a0143e8389e8d972d111b2b03bdec8bf450809b11e48c68f175ee2d4f10 WatchSource:0}: Error finding container 52774a0143e8389e8d972d111b2b03bdec8bf450809b11e48c68f175ee2d4f10: Status 404 returned error can't find the container with id 52774a0143e8389e8d972d111b2b03bdec8bf450809b11e48c68f175ee2d4f10
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.540757 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.548137 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.552551 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: W1129 01:10:57.554462 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-dbdf6cf6a201f50c59167e7bf64d715b5d156ab7ff083296d1b258036171e6f7 WatchSource:0}: Error finding container dbdf6cf6a201f50c59167e7bf64d715b5d156ab7ff083296d1b258036171e6f7: Status 404 returned error can't find the container with id dbdf6cf6a201f50c59167e7bf64d715b5d156ab7ff083296d1b258036171e6f7
Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.639892 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.846668 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.848849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.848882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.848893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:57 crc kubenswrapper[4749]: I1129 01:10:57.848921 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 29 01:10:57 crc kubenswrapper[4749]: E1129 01:10:57.849340 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.033055 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.037182 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 11:39:26.265970378 +0000 UTC
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.078955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.079051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dbdf6cf6a201f50c59167e7bf64d715b5d156ab7ff083296d1b258036171e6f7"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.080456 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e" exitCode=0
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.080534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.080551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52774a0143e8389e8d972d111b2b03bdec8bf450809b11e48c68f175ee2d4f10"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.080642 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.081465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.081490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.081499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.083342 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.084394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.084416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
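The "Generic (PLEG)" and "SyncLoop (PLEG)" entries above come from the kubelet's pod lifecycle event generator, which relists the runtime and diffs container states into ContainerStarted/ContainerDied events. A toy sketch of that diffing step follows; the types are made up for illustration, and the abbreviated container IDs are borrowed from this log purely as sample data:

```go
package main

import "fmt"

// state maps containerID -> "running" or "exited"; purely illustrative.
type state map[string]string

// relist emits Started/Died-style events by diffing two snapshots, in the
// spirit of the PLEG entries above (not the kubelet's actual implementation).
func relist(old, cur state) {
	for id, s := range cur {
		if _, seen := old[id]; !seen {
			// Newly observed container: it was started since the last relist.
			fmt.Printf("event: ContainerStarted %s\n", id)
		}
		if s == "exited" && old[id] != "exited" {
			// Transitioned to exited: report it as finished (exitCode elided).
			fmt.Printf("event: ContainerDied %s\n", id)
		}
	}
}

func main() {
	// Example: an init container that already finished plus a running sandbox.
	relist(state{}, state{"65b3a189": "exited", "52774a01": "running"})
}
```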
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.084425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.085935 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b" exitCode=0
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.085990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.086011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e60af1a6f69e71b8793908be136367b495a5e9e03473ef417ebbff07a67fac1e"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.086105 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.086772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.086813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.086821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.088586 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fa9e0d580e3dbc208cd16f14e5f64357dc5677f074d09e8cb68cfd9b7a848993" exitCode=0
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.088619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fa9e0d580e3dbc208cd16f14e5f64357dc5677f074d09e8cb68cfd9b7a848993"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.088654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b8614274ac975ae6ebacf36eeeefd3671ebd6d87379b1c54d7385c032c34e839"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.091251 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954" exitCode=0
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.091277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.091291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4d24e8ee9e7bde53f9b24dc9436ab416830e29e4839edd75e255c9365134b39c"}
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.091345 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.092047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.092089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.092104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:58 crc kubenswrapper[4749]: W1129 01:10:58.169435 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Nov 29 01:10:58 crc kubenswrapper[4749]: E1129 01:10:58.169564 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Nov 29 01:10:58 crc kubenswrapper[4749]: W1129 01:10:58.249560 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Nov 29 01:10:58 crc kubenswrapper[4749]: E1129 01:10:58.249653 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Nov 29 01:10:58 crc kubenswrapper[4749]: W1129 01:10:58.257575 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Nov 29 01:10:58 crc kubenswrapper[4749]: E1129 01:10:58.257670 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Nov 29 01:10:58 crc kubenswrapper[4749]: W1129 01:10:58.300603 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Nov 29 01:10:58 crc kubenswrapper[4749]: E1129 01:10:58.300673 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
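All four reflector failures above come from the initial LIST issued by a client-go shared informer factory (the factory.go:160 in each message); the reflector keeps retrying with backoff, and once a LIST finally succeeds the same path logs "Caches populated". A hedged sketch of driving that machinery, with the kubeconfig path assumed:

```go
// Sketch: a shared informer factory lists/watches Nodes, retrying while the
// API server refuses connections, then reports a synced cache.
package main

import (
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig") // assumed path
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactory(client, 10*time.Minute)
	nodeInformer := factory.Core().V1().Nodes().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop) // ListAndWatch runs in the background and retries on failure

	// While the API server is down, this is the phase that logs
	// "failed to list *v1.Node ... connect: connection refused".
	if !cache.WaitForCacheSync(stop, nodeInformer.HasSynced) {
		fmt.Println("cache never synced")
		return
	}
	fmt.Println("caches populated for *v1.Node")
}
```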
Nov 29 01:10:58 crc kubenswrapper[4749]: E1129 01:10:58.440868 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.649932 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.651561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.651591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.651602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:58 crc kubenswrapper[4749]: I1129 01:10:58.651631 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.037417 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 18:41:15.660527882 +0000 UTC
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.037482 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1049h30m16.623048828s for next certificate rotation
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.096331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.096384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.096400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.096411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.097793 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869" exitCode=0
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.097868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.098026 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.099223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.099290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.099309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.100342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.100373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.100391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.100472 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.101270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.101339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.101353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.104176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.104227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373"}
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.104241 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.104284 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.104243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138"}
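"Failed to ensure lease exists, will retry" above is the kubelet's node-lease controller failing to GET (and, if missing, CREATE) its Lease in the kube-node-lease namespace, then backing off (800ms, then 1.6s). A library-style sketch of the ensure step follows; names and the 40s duration mirror the log's conventions, retry/backoff is left to the caller, and the code is illustrative rather than the controller's own:

```go
package nodelease

import (
	"context"

	coordinationv1 "k8s.io/api/coordination/v1"
	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/utils/ptr"
)

// ensureLease fetches the node's Lease and creates it if absent. Transport
// errors (e.g. "connect: connection refused") are returned so the caller can
// retry on an interval, as the log shows the kubelet doing.
func ensureLease(ctx context.Context, c kubernetes.Interface, node string) error {
	leases := c.CoordinationV1().Leases("kube-node-lease")
	if _, err := leases.Get(ctx, node, metav1.GetOptions{}); err == nil {
		return nil // lease exists; a real controller would renew it here
	} else if !apierrors.IsNotFound(err) {
		return err // API server unreachable: surface for retry
	}
	lease := &coordinationv1.Lease{
		ObjectMeta: metav1.ObjectMeta{Name: node, Namespace: "kube-node-lease"},
		Spec: coordinationv1.LeaseSpec{
			HolderIdentity:       ptr.To(node),
			LeaseDurationSeconds: ptr.To[int32](40), // assumed default
		},
	}
	_, err := leases.Create(ctx, lease, metav1.CreateOptions{})
	return err
}
```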
event="NodeHasSufficientMemory" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.105025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.105044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.105093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.105119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.105131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:10:59 crc kubenswrapper[4749]: I1129 01:10:59.954029 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.112137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521"} Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.112266 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.113983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.114044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.114066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.115922 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68" exitCode=0 Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.116001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68"} Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.116164 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.117355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.117387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.117397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.119514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7366637f85479972f5dfabd3be0f7b5e4ce2b2b588cfcb28a24cf6157b5911a6"} Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.119568 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.119658 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.119811 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.121607 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.121663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.121687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.122646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:00 crc kubenswrapper[4749]: I1129 01:11:00.541034 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.128892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7"} Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.129001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456"} Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.129033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1"} Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.129045 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.129127 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.134920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.134995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:01 crc kubenswrapper[4749]: I1129 01:11:01.135016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.138824 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.138897 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.139552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab"} Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.139625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c"} Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.139753 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.140312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.140354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.140373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.140982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.141069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.141131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.577858 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.578188 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.581188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.581488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.581663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:02 crc 
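The probe entries in this stretch walk a startup probe from "unhealthy" to "started", after which readiness probing takes over. A self-contained toy prober showing how such an HTTP check classifies results; the URL, the 1s timeout, and the skipped TLS verification are assumptions for the sketch, not the kubelet's prober configuration:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// probe issues one HTTP GET and classifies the outcome the way the log
// entries read: transport errors and non-2xx/3xx codes are "unhealthy".
func probe(url string) string {
	client := &http.Client{
		Timeout: 1 * time.Second,
		// Demo only: static-pod healthz endpoints use self-signed certs.
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Sprintf("unhealthy: %v", err) // e.g. context deadline exceeded
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "started"
	}
	return fmt.Sprintf("unhealthy: HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// Endpoint taken from the cluster-policy-controller probe in this log.
	fmt.Println(probe("https://192.168.126.11:10357/healthz"))
}
```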
Nov 29 01:11:02 crc kubenswrapper[4749]: I1129 01:11:02.587533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.141888 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.141888 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.143798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.143860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.143879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.144280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.144475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.144642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.678642 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.703118 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.703548 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.703654 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.705467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.705524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:03 crc kubenswrapper[4749]: I1129 01:11:03.705544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.113291 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.118434 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.145114 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.145225 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.146946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.147013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.147037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.147077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.147104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.147123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.237818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.238126 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.240318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.240398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:04 crc kubenswrapper[4749]: I1129 01:11:04.240425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.149350 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.151299 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.151385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.151414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.207148 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.207457 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.209217 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.209277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:05 crc kubenswrapper[4749]: I1129 01:11:05.209296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:06 crc kubenswrapper[4749]: I1129 01:11:06.534672 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:11:06 crc kubenswrapper[4749]: I1129 01:11:06.535045 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:06 crc kubenswrapper[4749]: I1129 01:11:06.537580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:06 crc kubenswrapper[4749]: I1129 01:11:06.537760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:06 crc kubenswrapper[4749]: I1129 01:11:06.537783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:07 crc kubenswrapper[4749]: E1129 01:11:07.147749 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 29 01:11:08 crc kubenswrapper[4749]: E1129 01:11:08.653310 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.032748 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.535309 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.535463 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.817558 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.817667 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.822235 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 29 01:11:09 crc kubenswrapper[4749]: I1129 01:11:09.822310 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 29 01:11:10 crc kubenswrapper[4749]: I1129 01:11:10.253576 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:10 crc kubenswrapper[4749]: I1129 01:11:10.255304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:10 crc kubenswrapper[4749]: I1129 01:11:10.255417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:10 crc kubenswrapper[4749]: I1129 01:11:10.255478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:10 crc kubenswrapper[4749]: I1129 01:11:10.255570 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.714498 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.714775 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.716672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.716770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.716792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:13 crc kubenswrapper[4749]: I1129 01:11:13.723035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.121869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.122105 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.123648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.123718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.123738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.177586 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.178940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.179004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.179023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:14 crc kubenswrapper[4749]: E1129 01:11:14.809442 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.813524 4749 trace.go:236] Trace[70249703]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 01:11:01.457) (total time: 13355ms):
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[70249703]: ---"Objects listed" error: 13355ms (01:11:14.813)
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[70249703]: [13.35595057s] [13.35595057s] END
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.813583 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.814184 4749 trace.go:236] Trace[1725852564]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 01:11:00.504) (total time: 14309ms):
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1725852564]: ---"Objects listed" error: 14309ms (01:11:14.814)
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1725852564]: [14.309473079s] [14.309473079s] END
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.814230 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.815086 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.815243 4749 trace.go:236] Trace[1366142466]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 01:11:00.959) (total time: 13856ms):
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1366142466]: ---"Objects listed" error: 13856ms (01:11:14.815)
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1366142466]: [13.856032821s] [13.856032821s] END
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.815268 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.815387 4749 trace.go:236] Trace[1286112769]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (29-Nov-2025 01:11:01.407) (total time: 13407ms):
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1286112769]: ---"Objects listed" error: 13407ms (01:11:14.815)
Nov 29 01:11:14 crc kubenswrapper[4749]: Trace[1286112769]: [13.407882866s] [13.407882866s] END
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.815405 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.845721 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51048->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.845798 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51048->192.168.126.11:17697: read: connection reset by peer"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.846176 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.846241 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.846678 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Nov 29 01:11:14 crc kubenswrapper[4749]: I1129 01:11:14.846800 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.037745 4749 apiserver.go:52] "Watching apiserver"
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.041422 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.041822 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.042348 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.042471 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.042534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
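The Trace[...] blocks above are emitted by k8s.io/utils/trace, which buffers named steps and prints the whole trace only when total time exceeds a threshold, producing exactly the "Objects listed ... END" shape in the log. A minimal reproduction of that shape (the threshold and sleep are placeholders):

```go
package main

import (
	"time"

	utiltrace "k8s.io/utils/trace"
)

func main() {
	// Mirrors the trace name and field seen in the log entries above.
	t := utiltrace.New("Reflector ListAndWatch",
		utiltrace.Field{Key: "name", Value: "k8s.io/client-go/informers/factory.go:160"})
	// Only traces slower than the threshold are logged; the 13-14s LISTs
	// in this log cleared whatever threshold the reflector uses.
	defer t.LogIfLong(10 * time.Second)

	time.Sleep(50 * time.Millisecond) // stand-in for the initial LIST
	t.Step("Objects listed")
}
```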
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.042734 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.043151 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.043252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.043288 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.043448 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.044750 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.045993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046366 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046375 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046546 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046717 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046743 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046908 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.046996 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.080871 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.102256 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.121088 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.133124 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.137420 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.145914 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.162365 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.172831 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.181692 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.184124 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521" exitCode=255 Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.184175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.198155 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.198522 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.198625 4749 scope.go:117] "RemoveContainer" containerID="fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.212703 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216711 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216813 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216898 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.216982 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217103 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217146 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" 
(UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217262 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217375 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 
01:11:15.217497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217536 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217651 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217755 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217949 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.217939 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218006 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218037 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218436 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218563 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218710 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.218920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.219144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.219222 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.219353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.219490 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.219718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.220278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.220646 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.221177 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.221332 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.221515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.221786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222026 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.222994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223285 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223474 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223591 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223611 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223764 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223946 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223965 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224103 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224165 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224260 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224335 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224466 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224540 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224575 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224591 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224640 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224754 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224772 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224790 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224883 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224898 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224932 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225005 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 01:11:15 crc 
kubenswrapper[4749]: I1129 01:11:15.225111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225166 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 01:11:15 crc 
kubenswrapper[4749]: I1129 01:11:15.225306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225428 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225879 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226045 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226368 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226424 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226448 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226486 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226551 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226628 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226764 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227030 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227495 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227513 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227528 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227542 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227557 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227571 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on 
node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227586 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227600 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227627 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227642 4749 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227657 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227671 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227684 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227699 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227713 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227728 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227743 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227758 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227772 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227797 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227811 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227825 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227840 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227854 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227875 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227890 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227904 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227918 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227932 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227945 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227959 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227972 4749 reconciler_common.go:293] "Volume detached for 
volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227997 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228011 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228025 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228041 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228066 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228080 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228093 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.230045 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.240374 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.244461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.244655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.246580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.223845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
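[editor's note] The status patch above fails because the pod admission webhook (pod.network-node-identity.openshift.io) is not listening yet: every Post to https://127.0.0.1:9743/pod is refused. A minimal triage sketch, assuming only the address taken from the log (illustrative tooling, not kubelet code), that checks whether the endpoint has come up:

    // webhook_probe.go - illustrative triage snippet, not kubelet code:
    // check whether the admission webhook endpoint from the failed Post
    // ("https://127.0.0.1:9743/pod") is accepting TCP connections yet.
    package main

    import (
        "fmt"
        "net"
        "os"
        "time"
    )

    func main() {
        addr := "127.0.0.1:9743" // address taken from the log's dial error
        conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
        if err != nil {
            // Same failure mode as the log: "connect: connection refused"
            fmt.Fprintf(os.Stderr, "webhook not reachable: %v\n", err)
            os.Exit(1)
        }
        conn.Close()
        fmt.Println("webhook endpoint is accepting connections")
    }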
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.224773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.225381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248860 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226344 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226245 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226825 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.226936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.227927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.228657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.229174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.229660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.230057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.230390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.230550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.231544 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.232435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.233900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.234578 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:15.734547639 +0000 UTC m=+18.906697516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.249353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234757 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.234960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.235164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
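[editor's note] The nestedpendingoperations error above shows the kubelet's retry throttling: after the CSI TearDown fails (the kubevirt.io.hostpath-provisioner driver has not re-registered yet), retries are forbidden for durationBeforeRetry, here 500ms. A simplified model of that backoff, assuming the usual doubling-with-cap scheme; only the 500ms starting value comes from the log, the cap is an assumption:

    // backoff_model.go - simplified model (not kubelet's actual code) of
    // "No retries permitted until ... (durationBeforeRetry 500ms)".
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond           // first durationBeforeRetry in the log
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap on the doubling
        for failures := 1; failures <= 10; failures++ {
            fmt.Printf("failure %2d -> no retries for %v\n", failures, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }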
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.236720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.236743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.236885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.237317 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.249544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.237410 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.237542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.237556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.249654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238861 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.239068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.239139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.239248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.239279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.239450 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.238649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.240110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.240363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.240059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.242611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.244507 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.249859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.246633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.246924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.249933 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:15.749910526 +0000 UTC m=+18.922060393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.246977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.247178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.247690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.247722 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.247829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248338 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.250051 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:15.750032479 +0000 UTC m=+18.922182336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.248419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.250209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.250372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.250864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.251008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.251064 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.253209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
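[editor's note] The "not registered" failures above mean the kubelet's local object cache has no entry yet for the secret or configmap (a common transient right after a kubelet restart, while pods are being re-registered with the cache manager); they do not mean the object is absent from the API server. A sketch that checks the API side directly with client-go; the kubeconfig path is an assumption, the namespace and object names come from the log:

    // cache_vs_api.go - sketch: confirm the objects behind the "not
    // registered" errors actually exist in the API server.
    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // assumed path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        ns := "openshift-network-console"
        if _, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), "networking-console-plugin-cert", metav1.GetOptions{}); err != nil {
            fmt.Println("secret:", err)
        } else {
            fmt.Println("secret exists in the API; the kubelet cache just hasn't registered it yet")
        }
        if _, err := cs.CoreV1().ConfigMaps(ns).Get(context.TODO(), "networking-console-plugin", metav1.GetOptions{}); err != nil {
            fmt.Println("configmap:", err)
        } else {
            fmt.Println("configmap exists in the API")
        }
    }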
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.254169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.255133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.255785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.256628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.256942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.257158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.258023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.258724 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.258922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). 
InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.259848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.260836 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.260923 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.260996 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.261115 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:15.761093021 +0000 UTC m=+18.933242878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.261328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.260689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.261464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.262291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.262601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.263471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.263726 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.264174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.267119 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.267143 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.267157 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.267234 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:15.767214021 +0000 UTC m=+18.939363878 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.268511 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.269411 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.269557 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.270024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.270078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.270248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.270265 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.270355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.271936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.272415 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.272419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.273496 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.273604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.273633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.274042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.274442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.275408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.275468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.275505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.275652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.276480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.276851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277475 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.277737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.278437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.278574 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.278654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.278756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.279083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.279749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.280058 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.280478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.280560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.280849 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.284503 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.284683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.284840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.285091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.285341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.285481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.285581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.285718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.286724 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.286806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.286922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.287005 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.287050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288415 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288485 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288552 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.288608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.291874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.296719 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.299884 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.302491 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.305849 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.308077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.308112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.308123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.308139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.308151 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.314944 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.315663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.318440 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.321542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.322481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.327459 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29
T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328907 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328919 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328928 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328937 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328948 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328957 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328967 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328976 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328987 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.328995 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329076 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329090 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329100 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 
01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329112 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329123 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329160 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329180 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329191 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329220 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329230 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329240 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329252 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329265 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329277 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329287 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329297 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on 
node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329306 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329316 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329356 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329368 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329380 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329390 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329400 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329410 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329418 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329429 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329439 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node 
\"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329450 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329461 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329471 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329480 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329491 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329500 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329509 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329537 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329547 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329557 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329567 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329576 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329587 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 
01:11:15.329597 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329605 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329614 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329624 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329637 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329646 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329658 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329668 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329679 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329688 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329697 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329705 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329714 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 
01:11:15.329723 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329732 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329741 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329750 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329761 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329770 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329779 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329799 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329807 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329816 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329825 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329834 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329842 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc 
kubenswrapper[4749]: I1129 01:11:15.329855 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329864 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329873 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329882 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329891 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329899 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329908 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329917 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329925 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329934 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329956 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329964 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 29 
01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329973 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329983 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.329992 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330004 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330013 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330020 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330029 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330037 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330045 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330054 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330067 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330076 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330084 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330092 4749 reconciler_common.go:293] "Volume detached 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330101 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330109 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330117 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330126 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330134 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330142 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330150 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330159 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330173 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330182 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330191 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330214 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330221 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330230 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330238 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330246 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330255 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330263 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330272 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330283 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330291 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330300 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330308 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330316 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330325 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330334 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330342 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330351 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330361 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330370 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330379 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330388 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330398 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330405 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330414 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330422 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330432 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330440 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330448 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330456 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330465 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330473 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330482 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330490 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330499 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330507 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330516 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330524 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330533 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330541 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330553 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.330563 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") 
on node \"crc\" DevicePath \"\"" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.331983 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.337801 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.338943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.339003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.339016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.339038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.339052 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.346928 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.349321 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"f
faab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.353152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.353191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.353220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.353237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.353249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.356088 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.360421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.361131 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.361400 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.362919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.362958 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.362971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.362988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.363000 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.369239 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.378619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 29 01:11:15 crc kubenswrapper[4749]: W1129 01:11:15.396988 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1a45ed6320d03dd4393d06db4c0275199f0c78dc59ab34313286a21ae683d116 WatchSource:0}: Error finding container 1a45ed6320d03dd4393d06db4c0275199f0c78dc59ab34313286a21ae683d116: Status 404 returned error can't find the container with id 1a45ed6320d03dd4393d06db4c0275199f0c78dc59ab34313286a21ae683d116 Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.467107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.467652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.467670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.467717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.467733 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.570149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.570184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.570214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.570232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.570241 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.672410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.672459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.672468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.672484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.672493 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.734687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.734880 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:16.734860798 +0000 UTC m=+19.907010665 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.775133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.775171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.775183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.775223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.775240 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.835805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.835835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.835868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.835889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.835956 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.835993 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:16.835982392 +0000 UTC m=+20.008132249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836011 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836049 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836046 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836133 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836173 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836174 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:16.836145096 +0000 UTC m=+20.008294993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836074 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836230 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836456 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:16.836409063 +0000 UTC m=+20.008559070 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: E1129 01:11:15.836498 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:16.836487444 +0000 UTC m=+20.008637311 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.878150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.878188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.878210 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.878223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.878233 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.980917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.980979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.981009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.981039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:15 crc kubenswrapper[4749]: I1129 01:11:15.981060 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:15Z","lastTransitionTime":"2025-11-29T01:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.085320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.085393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.085413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.085448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.085472 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.188350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.188431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.188457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.188511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.188543 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.189411 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.192353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.192762 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.194663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.194810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.194894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1a45ed6320d03dd4393d06db4c0275199f0c78dc59ab34313286a21ae683d116"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.195596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"acf98cb2e973c8f270625d2509c196b416422671b546305eccbde5cc47d3c0a5"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.197330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.197395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c0b3ae60f534d0a0dba624dfe2a49e543738b9b9e0007a0d9eefc53ee7de4021"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.233555 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.262411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.291559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.291639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.291658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.291689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.291716 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.292129 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.319969 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.341426 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.366286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.388169 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.395782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.395835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.395848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.395868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.395882 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.404613 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.440479 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.461357 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.484064 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.497626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.497664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.497674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.497689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.497699 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.505760 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.522505 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.538928 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.542288 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.544991 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.549057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.557019 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.571139 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.594736 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.599481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.599510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.599520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.599537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.599547 4749 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.610728 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.629545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.647465 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.662021 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.678839 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.694007 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.702465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.702506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.702516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.702532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.702543 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.718011 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.731163 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:16Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.743603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.743730 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:18.743712249 +0000 UTC m=+21.915862106 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.804734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.804786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.804799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.804815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.804826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.844330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.844375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.844393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.844411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844480 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844514 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844549 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844568 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844580 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844523 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:18.844511014 +0000 UTC m=+22.016660871 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844626 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844755 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844767 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844779 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:18.844668828 +0000 UTC m=+22.016818725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844823 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:18.844805102 +0000 UTC m=+22.016954999 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:16 crc kubenswrapper[4749]: E1129 01:11:16.844875 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:18.844860503 +0000 UTC m=+22.017010400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.907899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.907992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.908015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.908051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:16 crc kubenswrapper[4749]: I1129 01:11:16.908073 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:16Z","lastTransitionTime":"2025-11-29T01:11:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.010919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.010962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.010972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.010988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.010998 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.074037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.074037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:17 crc kubenswrapper[4749]: E1129 01:11:17.074156 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.074174 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:17 crc kubenswrapper[4749]: E1129 01:11:17.074231 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:17 crc kubenswrapper[4749]: E1129 01:11:17.074483 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.081281 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.083029 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.085361 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.086678 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.087537 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.088331 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.088579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.089123 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.090582 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.091273 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.091927 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.092622 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.093390 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.093942 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.094595 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.095187 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.095731 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.096527 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.099254 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.101166 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.103150 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.104331 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.106674 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.107793 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.108189 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.110497 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.111812 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.113714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.113760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.113772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.113790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.113801 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.114401 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.115821 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.117221 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.118149 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.118984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.120311 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.120536 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.123026 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.124307 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.124782 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.125666 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.126864 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.127587 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.128114 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.128760 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.129537 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.130106 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.130814 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.131647 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.132343 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.132856 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.134489 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.135031 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.135804 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.136274 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.136770 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.137300 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.137801 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.138364 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.138879 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.140414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.164184 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.183782 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.198857 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.212120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.215902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.216035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.216151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.216276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.216401 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.234982 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.319964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.320259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.320337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.320399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.320460 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.423737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.423836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.423859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.423893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.423913 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.526740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.526821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.526843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.526875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.526900 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.631130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.631236 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.631258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.631333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.631358 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.733718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.733770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.733779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.733796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.733806 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.836642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.836695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.836709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.836726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.836739 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.940702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.940755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.940771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.940789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:17 crc kubenswrapper[4749]: I1129 01:11:17.940798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:17Z","lastTransitionTime":"2025-11-29T01:11:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.043976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.044013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.044023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.044057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.044068 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.147471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.147518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.147527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.147560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.147570 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.205738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.237789 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.250976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.251048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.251075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.251109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.251132 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.260463 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.282353 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.315004 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.334355 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.354724 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.355341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.355379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.355387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.355405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.355417 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.377943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.399482 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.421609 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:18Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.458719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.458815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.458834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.458863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.458893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.561941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.561978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.561988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.562005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.562017 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
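
Every "Failed to update status for pod" entry above fails for the same reason: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) is months behind the node's clock (2025-11-29), so the kubelet's TLS handshake to the webhook is rejected and every status patch bounces. A minimal way to confirm this from the node is to pull the certificate off the socket and print its validity window; the sketch below assumes the endpoint from the log is reachable and that the third-party cryptography package is available, neither of which is part of the cluster tooling itself.

    # Sketch: fetch the webhook's serving certificate and print its validity
    # window. Host/port come from the log line; `cryptography` is an assumption.
    import socket
    import ssl

    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint named in the log

    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # inspect the cert even though it is expired

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log

On a CRC cluster this pattern is typical of resuming a VM that sat stopped past its certificates' lifetime; the control plane normally re-issues them a few minutes after startup once its own rotation logic catches up with the clock.
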
Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.665248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.665300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.665310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.665327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.665336 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.761556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.761761 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:22.761742996 +0000 UTC m=+25.933892853 (durationBeforeRetry 4s). 
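
Note the retry discipline in the nestedpendingoperations entry above: a failed volume operation is not retried inline but parked until a deadline (here 4s away, with the backoff growing on repeated failures), which is why the same error reappears in widely spaced bursts rather than in a tight loop. A throwaway sketch, assuming journal text piped to stdin, to pull those deadlines out of a log like this one:

    # Sketch: extract volume-operation backoff deadlines from journal text on
    # stdin, matching the "No retries permitted until ... (durationBeforeRetry d)"
    # wording seen above.
    import re
    import sys

    pattern = re.compile(
        r"No retries permitted until (?P<until>\S+ \S+) .*?"
        r"\(durationBeforeRetry (?P<backoff>[\w.]+)\)"
    )

    for line in sys.stdin:
        m = pattern.search(line)
        if m:
            print(f"retry after {m['backoff']:>4} at {m['until']}")
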
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.768058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.768150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.768170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.768231 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.768256 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.862640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.862695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.862718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.862737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.862856 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.862916 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:22.862901191 +0000 UTC m=+26.035051048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.862986 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863104 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863128 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863009 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863261 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863285 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:22.8632511 +0000 UTC m=+26.035400987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863296 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863401 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-29 01:11:22.863371633 +0000 UTC m=+26.035521660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863449 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: E1129 01:11:18.863533 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:22.863517426 +0000 UTC m=+26.035667323 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.871384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.871426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.871436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.871450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.871461 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
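
Two separate recoveries are pending in the volume errors above. The "object ... not registered" failures mean the restarted kubelet has not yet re-populated its object cache with the Secrets and ConfigMaps those volumes reference, so SetUp is deferred; these clear on their own once the pod workers sync. The CSI TearDown failure is the same story on the plugin side: kubevirt.io.hostpath-provisioner has not yet re-registered with the kubelet's plugin watcher, so there is no CSI client to unmount with. A quick sketch of checking which CSI drivers have registered, assuming the default kubelet root directory (adjust if --root-dir is non-standard):

    # Sketch: list CSI driver registration sockets. With the default kubelet
    # layout, a driver's node registrar drops a socket into plugins_registry
    # once it is up, which is what makes TearDown possible again.
    import os

    REGISTRY = "/var/lib/kubelet/plugins_registry"

    for entry in sorted(os.listdir(REGISTRY)):
        print(entry)  # e.g. a kubevirt.io.hostpath-provisioner registration socket
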
Has your network provider started?"} Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.974815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.974898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.974925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.974961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:18 crc kubenswrapper[4749]: I1129 01:11:18.974992 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:18Z","lastTransitionTime":"2025-11-29T01:11:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.074678 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.074670 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.074691 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:19 crc kubenswrapper[4749]: E1129 01:11:19.075049 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:19 crc kubenswrapper[4749]: E1129 01:11:19.074829 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:19 crc kubenswrapper[4749]: E1129 01:11:19.075371 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.078485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.078547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.078575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.078611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.078637 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.185757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.185799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.185809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.185827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.185837 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.290119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.290168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.290179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.290209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.290227 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.393460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.393520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.393531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.393545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.393554 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.496866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.496909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.496919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.496935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.496946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.599749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.599809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.599834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.599859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.599875 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.702678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.702729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.702742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.702766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.702783 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.805086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.805138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.805152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.805175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.805188 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.907387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.907430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.907442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.907466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:19 crc kubenswrapper[4749]: I1129 01:11:19.907478 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:19Z","lastTransitionTime":"2025-11-29T01:11:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.009518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.009555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.009567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.009582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.009591 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.112323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.112359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.112367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.112381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.112391 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.216774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.216828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.216838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.216861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.216874 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.319117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.319178 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.319190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.319228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.319242 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.384140 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2gf7g"] Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.384463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.384594 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-75zrr"] Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.385015 4749 util.go:30] "No sandbox for pod can be found. 
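
The reflector warnings that follow ("no relationship found between node 'crc' and this object") are the node authorizer doing its job rather than a data problem: a kubelet may only read Secrets and ConfigMaps referenced by pods already bound to it, and this early after startup the authorizer's graph has no such edges yet, so the first list/watch attempts for the multus and dns objects are denied and succeed on retry once the pod bindings are known. A sketch that reproduces the denial with the kubelet's own credentials, assuming the kubernetes Python client and a kubelet kubeconfig at /var/lib/kubelet/kubeconfig (both assumptions, not cluster tooling):

    # Sketch: read a ConfigMap with the kubelet's credentials to observe the
    # node-authorizer 403. Kubeconfig path and client library are assumptions.
    from kubernetes import client, config
    from kubernetes.client.exceptions import ApiException

    config.load_kube_config(config_file="/var/lib/kubelet/kubeconfig")

    try:
        client.CoreV1Api().read_namespaced_config_map(
            "multus-daemon-config", "openshift-multus"
        )
    except ApiException as exc:
        # Expect 403 until a pod referencing this ConfigMap is bound to the node.
        print(exc.status, exc.reason)
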
Need to start a new one" pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.386219 4749 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.386284 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.386757 4749 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.386772 4749 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.386823 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.386862 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.387072 4749 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.387139 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.387400 4749 reflector.go:561] 
object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.387442 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.387399 4749 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.387539 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.387713 4749 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.387745 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: W1129 01:11:20.390294 4749 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Nov 29 01:11:20 crc kubenswrapper[4749]: E1129 01:11:20.390338 4749 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.406127 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.422337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.422408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.422426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.422461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.422479 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.442249 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.457110 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.479095 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.502759 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.519640 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.524286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.524335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.524344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.524360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.524370 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.533068 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.567954 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cnibin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-kubelet\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-multus-certs\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576643 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-bin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576716 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cni-binary-copy\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576747 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-multus\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvvn\" (UniqueName: \"kubernetes.io/projected/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-kube-api-access-4gvvn\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-system-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-os-release\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d508d053-4b4d-472c-afa0-43a89560cdf7-hosts-file\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-conf-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.576973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-hostroot\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-netns\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-socket-dir-parent\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-daemon-config\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-etc-kubernetes\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.577338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbrm\" (UniqueName: \"kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.590425 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.610940 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.626085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.626127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.626138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.626155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.626166 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.641341 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.660617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.672185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cnibin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-kubelet\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-multus-certs\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-bin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cni-binary-copy\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-multus\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvvn\" (UniqueName: \"kubernetes.io/projected/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-kube-api-access-4gvvn\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-system-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " 
pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-os-release\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d508d053-4b4d-472c-afa0-43a89560cdf7-hosts-file\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-bin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-conf-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-kubelet\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-os-release\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-netns\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-conf-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-var-lib-cni-multus\") 
pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-netns\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678597 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-system-cni-dir\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-hostroot\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d508d053-4b4d-472c-afa0-43a89560cdf7-hosts-file\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-multus-certs\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-hostroot\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cnibin\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-socket-dir-parent\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-socket-dir-parent\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbrm\" (UniqueName: \"kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.678982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.679026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-daemon-config\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.679083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-etc-kubernetes\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.679119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-etc-kubernetes\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.681207 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.699545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.713787 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.728504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.728544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.728561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.728595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.728607 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.735004 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.748609 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.763513 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.773292 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wz6xx"] Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.773965 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mnsct"] Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.774083 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.774615 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.775847 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.776478 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.776729 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.777456 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.777536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.779297 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.780888 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m7sg4"] Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.781259 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.781616 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.783825 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.787439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.787513 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.787560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.789321 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.789486 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.789630 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.789896 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.803448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.823737 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.830821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.830872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.830881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.830912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.830926 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.851770 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.871307 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.880892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.880936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 
01:11:20.880954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.880975 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/800b3936-ba93-47d8-9417-2fdc5ce4d171-mcd-auth-proxy-config\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.880990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2f7\" (UniqueName: \"kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881220 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-os-release\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881270 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qfq\" (UniqueName: \"kubernetes.io/projected/800b3936-ba93-47d8-9417-2fdc5ce4d171-kube-api-access-28qfq\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-system-cni-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881341 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/800b3936-ba93-47d8-9417-2fdc5ce4d171-rootfs\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mr44\" (UniqueName: \"kubernetes.io/projected/52dcfbe3-4017-41b0-a1a6-f117eb831499-kube-api-access-4mr44\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/800b3936-ba93-47d8-9417-2fdc5ce4d171-proxy-tls\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-cnibin\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.881714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.887945 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.900515 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.919090 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-ove
rrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr
2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.931977 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.932374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.932404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.932414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.932429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.932440 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:20Z","lastTransitionTime":"2025-11-29T01:11:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.945111 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.957974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.970161 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982426 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/800b3936-ba93-47d8-9417-2fdc5ce4d171-rootfs\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mr44\" (UniqueName: \"kubernetes.io/projected/52dcfbe3-4017-41b0-a1a6-f117eb831499-kube-api-access-4mr44\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/800b3936-ba93-47d8-9417-2fdc5ce4d171-rootfs\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/800b3936-ba93-47d8-9417-2fdc5ce4d171-proxy-tls\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-cnibin\") pod 
\"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-cnibin\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.982942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983603 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: 
\"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983559 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.983692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/800b3936-ba93-47d8-9417-2fdc5ce4d171-mcd-auth-proxy-config\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2f7\" (UniqueName: \"kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides\") pod \"ovnkube-node-m7sg4\" (UID: 
\"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-os-release\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-os-release\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qfq\" (UniqueName: \"kubernetes.io/projected/800b3936-ba93-47d8-9417-2fdc5ce4d171-kube-api-access-28qfq\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-system-cni-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984813 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.985045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-system-cni-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.984620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.985104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.985101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.985163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log\") pod \"ovnkube-node-m7sg4\" (UID: 
\"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.985163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.986651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/800b3936-ba93-47d8-9417-2fdc5ce4d171-mcd-auth-proxy-config\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.986550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.986845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52dcfbe3-4017-41b0-a1a6-f117eb831499-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.991337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/800b3936-ba93-47d8-9417-2fdc5ce4d171-proxy-tls\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.992716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:20 crc kubenswrapper[4749]: I1129 01:11:20.998930 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:20Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.003058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qfq\" (UniqueName: \"kubernetes.io/projected/800b3936-ba93-47d8-9417-2fdc5ce4d171-kube-api-access-28qfq\") pod \"machine-config-daemon-mnsct\" (UID: \"800b3936-ba93-47d8-9417-2fdc5ce4d171\") " pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.005177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pr2f7\" (UniqueName: \"kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7\") pod \"ovnkube-node-m7sg4\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.009863 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:21Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.034554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.034793 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.034904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.035020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.035113 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.035577 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5
646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:21Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.074302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.074363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.074714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.074387 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.074758 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.075001 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.100352 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.107735 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.148577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.148926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.148939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.148957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.148969 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.203338 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.217996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"38e389a370f7a3bd521b153442d9ad90da8e6250ad222a568a1602b522522a3a"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.218937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"11278297036fca26ce69b7465e00ad85e251a4539fcc7919e898d0a3452aec89"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.250968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.250998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.251007 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.251020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.251030 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.279362 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.289304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvvn\" (UniqueName: \"kubernetes.io/projected/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-kube-api-access-4gvvn\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.290958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mr44\" (UniqueName: \"kubernetes.io/projected/52dcfbe3-4017-41b0-a1a6-f117eb831499-kube-api-access-4mr44\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.308700 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.310124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-multus-daemon-config\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.353354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.353397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.353407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.353442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.353454 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.455614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.455647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.455658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.455675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.455685 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.494636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.545184 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.557984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.558044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.558062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.558089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.558106 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.660949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.660986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.660996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.661011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.661022 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.661123 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.664842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52dcfbe3-4017-41b0-a1a6-f117eb831499-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz6xx\" (UID: \"52dcfbe3-4017-41b0-a1a6-f117eb831499\") " pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.670017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/454ec33e-9530-4cf0-ad08-9c3a21b0e56b-cni-binary-copy\") pod \"multus-2gf7g\" (UID: \"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\") " pod="openshift-multus/multus-2gf7g" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.689665 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.691948 4749 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.692341 4749 projected.go:194] Error preparing data for projected volume kube-api-access-kgbrm for pod openshift-dns/node-resolver-75zrr: failed to sync configmap cache: timed out waiting for the condition Nov 29 01:11:21 crc kubenswrapper[4749]: E1129 01:11:21.692520 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm podName:d508d053-4b4d-472c-afa0-43a89560cdf7 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:22.192488144 +0000 UTC m=+25.364638011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kgbrm" (UniqueName: "kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm") pod "node-resolver-75zrr" (UID: "d508d053-4b4d-472c-afa0-43a89560cdf7") : failed to sync configmap cache: timed out waiting for the condition Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.716529 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.764096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.764132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.764175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.764215 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.764231 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.854368 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.866501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.866544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.866557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.866577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.866590 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.907787 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2gf7g" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.968792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.968853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.968862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.968880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:21 crc kubenswrapper[4749]: I1129 01:11:21.968892 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:21Z","lastTransitionTime":"2025-11-29T01:11:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.070909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.070970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.070981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.070995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.071005 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.173702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.174025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.174035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.174066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.174076 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.197497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbrm\" (UniqueName: \"kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.203099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbrm\" (UniqueName: \"kubernetes.io/projected/d508d053-4b4d-472c-afa0-43a89560cdf7-kube-api-access-kgbrm\") pod \"node-resolver-75zrr\" (UID: \"d508d053-4b4d-472c-afa0-43a89560cdf7\") " pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.218753 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-75zrr" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.221864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerStarted","Data":"57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.221914 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerStarted","Data":"970c31d2878ced459815c3910da9e01acdcce37e5be8f0d8b6e524f7f0c8b4d1"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.223632 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" exitCode=0 Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.223694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.224877 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1" exitCode=0 Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.225041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.225084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerStarted","Data":"f5948008c8ce2b604210ae8b1fabfdaa54ea8ecf99c9c6997f369415edb4ee32"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.228535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.228570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.239299 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.252406 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.265608 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: W1129 01:11:22.270727 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd508d053_4b4d_472c_afa0_43a89560cdf7.slice/crio-ce7727d7827cf988621c509b75be2b698e3f30094edea41ed77b7f3a67162565 WatchSource:0}: Error finding container ce7727d7827cf988621c509b75be2b698e3f30094edea41ed77b7f3a67162565: Status 404 returned error can't find the container with id ce7727d7827cf988621c509b75be2b698e3f30094edea41ed77b7f3a67162565 Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.276090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.276118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.276126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.276138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.276146 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.282645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.293569 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.307500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
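
The patch bodies in these failures are hard to read because they are quoted on the way into the journal: klog renders the err string with the embedded patch escaped inside it, so every quote arrives doubled as \\\" . Roughly one round of JSON string decoding per quoting layer recovers the original patch object; an abbreviated example (the uid is taken from the multus-2gf7g entry below, and real journal lines need the outer err="..." layer stripped first):

    # Recover an embedded status patch from its escaped journal form.
    import json

    raw = '"{\\"metadata\\":{\\"uid\\":\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\"}}"'
    as_string = json.loads(raw)      # peel the outer quoting -> JSON text
    patch = json.loads(as_string)    # parse the actual patch object
    print(patch["metadata"]["uid"])  # 454ec33e-9530-4cf0-ad08-9c3a21b0e56b

Decoded this way, the patches show the kubelet is only trying to record routine container readiness transitions; the payloads themselves are well-formed, and it is solely the webhook's expired certificate that rejects them.
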
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.320396 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.338551 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984
ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.358522 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.371934 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.380726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.380774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.380786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.380803 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.380815 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.395655 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] 
pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.409841 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.423264 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.438687 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.451037 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.462678 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.479079 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.482849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.482883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.482893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.482908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.482920 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.498961 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.513747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.528891 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.542088 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.555760 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.567599 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.583374 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc 
kubenswrapper[4749]: I1129 01:11:22.584537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.584564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.584574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.584588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.584597 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.598096 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.609898 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.621107 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.644291 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:22Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.686667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.686700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.686708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.686722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.686734 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.789523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.789565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.789575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.789589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.789599 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.803126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.803371 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:30.803332979 +0000 UTC m=+33.975482866 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.891796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.891837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.891845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.891860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.891872 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.904299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.904337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.904358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.904378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904459 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904534 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904570 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904583 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904534 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904545 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:30.904528804 +0000 UTC m=+34.076678661 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904853 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:30.904832412 +0000 UTC m=+34.076982259 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904483 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904885 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904899 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.904950 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:30.904874813 +0000 UTC m=+34.077024830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:22 crc kubenswrapper[4749]: E1129 01:11:22.905089 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:30.905060688 +0000 UTC m=+34.077210835 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.997765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.997802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.997810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.997824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:22 crc kubenswrapper[4749]: I1129 01:11:22.997834 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:22Z","lastTransitionTime":"2025-11-29T01:11:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.009950 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kr9qp"] Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.010326 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.012134 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.012735 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.012862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.012990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.024758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc 
kubenswrapper[4749]: I1129 01:11:23.034534 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.046928 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.058323 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.074756 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.074781 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.074815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:23 crc kubenswrapper[4749]: E1129 01:11:23.074907 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:23 crc kubenswrapper[4749]: E1129 01:11:23.074954 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:23 crc kubenswrapper[4749]: E1129 01:11:23.075070 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.077393 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.100292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.100326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.100334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.100351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.100362 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.106418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46bc9153-d89a-4dbe-a806-ae78091d27a3-host\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.106471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwxx\" (UniqueName: \"kubernetes.io/projected/46bc9153-d89a-4dbe-a806-ae78091d27a3-kube-api-access-dcwxx\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.106511 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46bc9153-d89a-4dbe-a806-ae78091d27a3-serviceca\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.111579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z 
is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.150960 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.161520 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.170126 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.180518 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.190528 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.201514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.202771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.202802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.202810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.202824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.202834 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.207594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46bc9153-d89a-4dbe-a806-ae78091d27a3-serviceca\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.207678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46bc9153-d89a-4dbe-a806-ae78091d27a3-host\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.207726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwxx\" (UniqueName: \"kubernetes.io/projected/46bc9153-d89a-4dbe-a806-ae78091d27a3-kube-api-access-dcwxx\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.208087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46bc9153-d89a-4dbe-a806-ae78091d27a3-host\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.209558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46bc9153-d89a-4dbe-a806-ae78091d27a3-serviceca\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.211902 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.225367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwxx\" (UniqueName: \"kubernetes.io/projected/46bc9153-d89a-4dbe-a806-ae78091d27a3-kube-api-access-dcwxx\") pod \"node-ca-kr9qp\" (UID: \"46bc9153-d89a-4dbe-a806-ae78091d27a3\") " pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235425 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.235459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" 
event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.236564 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.237161 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b" exitCode=0 Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.237220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.238555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75zrr" event={"ID":"d508d053-4b4d-472c-afa0-43a89560cdf7","Type":"ContainerStarted","Data":"0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.238608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75zrr" event={"ID":"d508d053-4b4d-472c-afa0-43a89560cdf7","Type":"ContainerStarted","Data":"ce7727d7827cf988621c509b75be2b698e3f30094edea41ed77b7f3a67162565"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.251590 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.267813 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.279177 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.293461 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.304948 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.304987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.304996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.305011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.305020 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.317094 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z 
is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.322302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kr9qp" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.334868 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: W1129 01:11:23.337237 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46bc9153_d89a_4dbe_a806_ae78091d27a3.slice/crio-27343c0372bac624a3a350f11dc44c091d19407bd1083e802c4869dba400f5e7 WatchSource:0}: Error finding container 27343c0372bac624a3a350f11dc44c091d19407bd1083e802c4869dba400f5e7: Status 404 returned error can't find the container with id 27343c0372bac624a3a350f11dc44c091d19407bd1083e802c4869dba400f5e7 Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.350075 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.365896 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.392645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.406830 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.408529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.408571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.408582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.408599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.408610 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.417568 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.430490 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.445314 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.458086 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.475957 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.485259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:23Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.510936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.510962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.510970 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.510984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.510996 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.612669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.612703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.612712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.612724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.612734 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.714924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.714980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.714998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.715021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.715038 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.818031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.818102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.818124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.818240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.818281 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.921353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.921404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.921414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.921431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:23 crc kubenswrapper[4749]: I1129 01:11:23.921440 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:23Z","lastTransitionTime":"2025-11-29T01:11:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.025484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.025519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.025530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.025551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.025562 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.128607 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.128686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.128705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.128732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.128751 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.232110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.232546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.232564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.232589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.232607 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.246291 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d" exitCode=0 Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.246405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.248775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kr9qp" event={"ID":"46bc9153-d89a-4dbe-a806-ae78091d27a3","Type":"ContainerStarted","Data":"68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.248895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kr9qp" event={"ID":"46bc9153-d89a-4dbe-a806-ae78091d27a3","Type":"ContainerStarted","Data":"27343c0372bac624a3a350f11dc44c091d19407bd1083e802c4869dba400f5e7"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.287156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e
9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.305850 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.321184 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.336075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.336173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.336244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.336286 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.336310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.341688 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.366987 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.385833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.406034 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.418411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 
crc kubenswrapper[4749]: I1129 01:11:24.432603 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.440553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.440606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.440616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.440641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.440655 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.444519 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.462041 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.486726 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.498651 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.516904 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.534340 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.543457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.543512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.543530 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.543554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.543576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.554461 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.574078 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.597156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.609040 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.627221 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\
"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.642280 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.646431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.646466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.646477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.646497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.646510 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.661979 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.692485 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.711240 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.729670 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.750379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.750459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.750481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.750514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.750537 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.752769 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.774276 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.796521 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.815800 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.844922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:24Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.853831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.853898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.853916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.853945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.853969 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.956476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.956514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.956523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.956537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:24 crc kubenswrapper[4749]: I1129 01:11:24.956553 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:24Z","lastTransitionTime":"2025-11-29T01:11:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.058920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.058975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.058989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.059014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.059031 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.074797 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.075006 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.075028 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.075030 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.075329 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.075383 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.161890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.161924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.161934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.161996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.162006 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.262488 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5" exitCode=0 Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.262604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.264737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.264769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.264786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.264809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.264827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.271435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.283274 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.299871 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.318145 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.340866 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984
ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.364415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.367866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.367909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.367921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.367940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.367954 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.379650 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.442084 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.463032 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.472089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.472148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.472161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.472181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.472209 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.479786 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.500319 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.502120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.502151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.502166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.502188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.502230 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.515526 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.517319 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.520754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.520811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.520836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.520869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.520893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.532775 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.539606 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.546087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.546138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.546150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.546172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.546187 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.551058 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.558934 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"f
faab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.563722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.563750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.563761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.563777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.563789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.568778 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.583359 4749 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256
:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"si
zeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":46317936
5},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.590430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.590467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.590474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.590493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.590503 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.604884 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31
d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.605794 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:25Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:25 crc kubenswrapper[4749]: E1129 01:11:25.605927 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.607772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.607797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.607815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.607833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.607846 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.711999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.712129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.712340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.712512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.712691 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816497 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.816497 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.919391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.919612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.919813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.919968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:25 crc kubenswrapper[4749]: I1129 01:11:25.920111 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:25Z","lastTransitionTime":"2025-11-29T01:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.024587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.024946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.025060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.025161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.025277 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.128177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.128425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.128566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.128663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.128759 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.231324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.232378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.232422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.232448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.232467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.279234 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b" exitCode=0
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.279284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b"}
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.300997 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.323160 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.335572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.335612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.335624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.335641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.335657 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.341015 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.374451 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.396009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.410642 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.424590 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.440781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.441565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.442427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.442453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.442477 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.444161 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.460637 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.480676 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.502152 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.518865 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.542476 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.544897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.544930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.544941 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.544957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.544968 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.556070 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.585517 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:26Z 
is after 2025-08-24T17:21:41Z" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.647597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.647643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.647655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.647673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.647684 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.750365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.750440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.750460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.750492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.750513 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.855143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.855183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.855191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.855228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.855238 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.957687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.957754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.957765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.957779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:26 crc kubenswrapper[4749]: I1129 01:11:26.957791 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:26Z","lastTransitionTime":"2025-11-29T01:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.062248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.062285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.062297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.062311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.062326 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.074035 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:27 crc kubenswrapper[4749]: E1129 01:11:27.074128 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.074314 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:27 crc kubenswrapper[4749]: E1129 01:11:27.074434 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.074531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:27 crc kubenswrapper[4749]: E1129 01:11:27.074582 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.089480 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.117156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.134012 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.148700 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.162277 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.164569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.164602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.164616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.164637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.164651 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.176488 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.191500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.204160 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.244539 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z 
is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.269453 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.270976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.271003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.271014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.271031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.271041 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.293651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.293971 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.294030 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.294039 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.294134 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.297468 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dcfbe3-4017-41b0-a1a6-f117eb831499" containerID="a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280" exitCode=0 Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.297495 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerDied","Data":"a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.306920 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.317753 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.318247 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.323386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.337981 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube
-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.351136 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.373095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.373135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.373145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.373160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.373172 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.378343 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.389919 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.400564 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.415101 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.427170 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.439022 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.451120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.463516 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.478259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.478298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.478306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.478320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.478329 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.481018 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.493815 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.507727 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.524794 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.537644 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.551148 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.562637 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:27Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.581767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.581820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.581832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.581854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.581871 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.684216 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.684249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.684258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.684273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.684302 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.786840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.786910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.786930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.786962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.787019 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.889963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.890033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.890049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.890075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.890104 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.992305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.992343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.992355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.992375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:27 crc kubenswrapper[4749]: I1129 01:11:27.992389 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:27Z","lastTransitionTime":"2025-11-29T01:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.095797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.095878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.095902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.095932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.095955 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.199234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.199340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.199367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.199396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.199417 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.301684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.301752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.301771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.301800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.301818 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.306608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" event={"ID":"52dcfbe3-4017-41b0-a1a6-f117eb831499","Type":"ContainerStarted","Data":"4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.330942 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.355127 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.384886 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.405886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.405942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.405951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.405967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.405978 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.409148 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.424821 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.441324 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.463270 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.480530 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.509873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.510481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.510750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.511021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.511597 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.513789 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc
7a2ee223b1463fdbb7ec78be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.542377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.567101 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.588937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.615543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.615881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.615957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.616062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.616148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.626394 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.646305 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.668100 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:28Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.719521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.719625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.719651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.719681 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.719700 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.822555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.822603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.822615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.822636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.822649 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.925350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.925763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.925850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.925976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:28 crc kubenswrapper[4749]: I1129 01:11:28.926075 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:28Z","lastTransitionTime":"2025-11-29T01:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.029725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.029792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.029805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.029846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.029859 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.074727 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.074861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:29 crc kubenswrapper[4749]: E1129 01:11:29.074887 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:29 crc kubenswrapper[4749]: E1129 01:11:29.075069 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.074750 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:29 crc kubenswrapper[4749]: E1129 01:11:29.075464 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.132994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.133047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.133061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.133085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.133101 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.236276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.236378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.236401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.236434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.236454 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.339778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.339833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.339842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.339862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.339873 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.443731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.443812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.443832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.443858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.443875 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.547863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.548118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.548127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.548143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.548153 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.651366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.651441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.651459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.651484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.651505 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.754805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.754867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.754883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.754909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.754936 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.858579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.858653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.858672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.858707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.858768 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.962526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.962572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.962581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.962600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:29 crc kubenswrapper[4749]: I1129 01:11:29.962613 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:29Z","lastTransitionTime":"2025-11-29T01:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.065475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.065573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.065597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.065634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.065661 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.168536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.168633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.168653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.168683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.168709 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.272187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.272292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.272310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.272353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.272372 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.318585 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/0.log" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.323543 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be" exitCode=1 Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.323618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.324914 4749 scope.go:117] "RemoveContainer" containerID="11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.347366 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.372071 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.381852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.381915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.381933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.381958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.381976 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.397833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.422882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.435153 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.451501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.471663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.484150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.484217 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.484235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.484256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.484272 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.490452 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.523759 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc
7a2ee223b1463fdbb7ec78be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:29Z\\\",\\\"message\\\":\\\"8 for removal\\\\nI1129 01:11:29.481367 6117 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1129 01:11:29.481359 6117 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:29.481372 6117 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1129 01:11:29.481388 6117 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:29.481399 6117 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:29.481431 6117 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1129 01:11:29.481460 6117 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 01:11:29.481664 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:29.481694 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:29.481729 6117 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 01:11:29.481734 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:29.481739 6117 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 01:11:29.481766 6117 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:29.481789 6117 handler.go:208] Removed *v1.Node event handler 7\\\\nI1129 01:11:29.481817 6117 factory.go:656] Stopping watch factory\\\\nI1129 01:11:29.481846 6117 ovnkube.go:599] Stopped 
ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.540758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.557104 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.573528 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.586999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.587040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.587048 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.587081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.587091 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.595919 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.610140 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.622527 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:30Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.689271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.689381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.689396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.689414 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.689426 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.791403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.791443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.791457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.791476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.791487 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.805014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.805273 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:11:46.805252338 +0000 UTC m=+49.977402195 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.894805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.894862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.894877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.894903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.894916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.906366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.906425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.906453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.906472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906545 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906640 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906682 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:46.906650799 +0000 UTC m=+50.078800816 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906727 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:46.90670538 +0000 UTC m=+50.078855227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906650 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906751 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906763 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906776 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906851 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906811 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:46.906795432 +0000 UTC m=+50.078945289 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.906876 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:30 crc kubenswrapper[4749]: E1129 01:11:30.907031 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:46.906997577 +0000 UTC m=+50.079147434 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.997898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.997957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.997968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.997990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:30 crc kubenswrapper[4749]: I1129 01:11:30.998003 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:30Z","lastTransitionTime":"2025-11-29T01:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.074903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.075033 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:31 crc kubenswrapper[4749]: E1129 01:11:31.075116 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.074903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:31 crc kubenswrapper[4749]: E1129 01:11:31.075185 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:31 crc kubenswrapper[4749]: E1129 01:11:31.075354 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.100931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.100985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.100995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.101016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.101027 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.246080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.246114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.246123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.246138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.246148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.329939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/0.log" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.334715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.335773 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.350715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.350764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.350775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.350791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.350803 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.363786 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.378687 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.398347 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.425622 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984
ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.442291 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.453538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.453703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.453813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.453893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.453985 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.455925 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.476121 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.495352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.513125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.530988 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.545674 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.557049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.557256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.557352 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.557683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.557810 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.563886 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.577270 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.590525 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.611496 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:29Z\\\",\\\"message\\\":\\\"8 for removal\\\\nI1129 01:11:29.481367 6117 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1129 01:11:29.481359 6117 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:29.481372 6117 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1129 01:11:29.481388 6117 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:29.481399 6117 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:29.481431 6117 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1129 01:11:29.481460 6117 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 01:11:29.481664 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:29.481694 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:29.481729 6117 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 01:11:29.481734 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:29.481739 6117 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 01:11:29.481766 6117 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:29.481789 6117 handler.go:208] Removed *v1.Node event handler 7\\\\nI1129 01:11:29.481817 6117 factory.go:656] Stopping watch factory\\\\nI1129 01:11:29.481846 6117 ovnkube.go:599] Stopped 
ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:31Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.660319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.660372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.660385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.660403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.660416 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.763192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.763261 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.763273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.763291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.763304 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.866822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.866890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.866909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.866954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.866973 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.970457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.970549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.970566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.970594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:31 crc kubenswrapper[4749]: I1129 01:11:31.970616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:31Z","lastTransitionTime":"2025-11-29T01:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.074615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.074680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.074698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.074724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.074745 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.177292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.177356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.177366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.177379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.177389 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.281414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.281475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.281488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.281507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.281520 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.342781 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/1.log" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.344345 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/0.log" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.350285 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be" exitCode=1 Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.350376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.350485 4749 scope.go:117] "RemoveContainer" containerID="11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.353312 4749 scope.go:117] "RemoveContainer" containerID="c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be" Nov 29 01:11:32 crc kubenswrapper[4749]: E1129 01:11:32.355032 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.373102 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.387901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.387982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.388004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.388033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.388052 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
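
The patch bodies in these entries read as triple-escaped (\\\" around every JSON quote) because the same bytes pass through three quoting layers: the patch is JSON, the kubelet embeds it as a quoted string inside the error message, and the structured logger quotes the whole err value again when writing the entry. A minimal illustration of the layering, with the patch trimmed to a single condition for brevity:

package main

import (
	"encoding/json"
	"fmt"
)

type condition struct {
	Type   string `json:"type"`
	Status string `json:"status"`
}

func main() {
	patch := map[string]any{
		"status": map[string]any{
			"conditions": []condition{{Type: "Ready", Status: "False"}},
		},
	}
	raw, _ := json.Marshal(patch) // layer 1: the JSON patch itself
	fmt.Println(string(raw))
	// layer 2: the patch quoted into the error string
	errMsg := fmt.Sprintf("failed to patch status %q", string(raw))
	fmt.Println(errMsg)
	// layer 3: the error string quoted again by the structured logger
	fmt.Printf("%q\n", errMsg)
}
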
Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.389696 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.405470 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.429976 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://11553056e7a24e8af1bbfb734308760f1dd901dc7a2ee223b1463fdbb7ec78be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:29Z\\\",\\\"message\\\":\\\"8 for removal\\\\nI1129 01:11:29.481367 6117 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1129 01:11:29.481359 6117 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:29.481372 6117 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1129 01:11:29.481388 6117 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:29.481399 6117 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:29.481431 6117 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1129 01:11:29.481460 6117 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1129 01:11:29.481664 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:29.481694 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:29.481729 6117 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1129 01:11:29.481734 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:29.481739 6117 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1129 01:11:29.481766 6117 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:29.481789 6117 handler.go:208] Removed *v1.Node event handler 7\\\\nI1129 01:11:29.481817 6117 factory.go:656] Stopping watch factory\\\\nI1129 01:11:29.481846 6117 ovnkube.go:599] Stopped ovnkube\\\\nI11\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.451990 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.474998 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.491011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.491087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.491101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.491120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.491138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.496101 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.516984 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.535242 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.549080 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.572416 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.589833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.594053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.594140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.594157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.594173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.594220 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.607922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.622606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.632547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:32Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.696151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.696214 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.696227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.696244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.696258 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.799270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.799313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.799323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.799340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.799352 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.901692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.901754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.901772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.901798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:32 crc kubenswrapper[4749]: I1129 01:11:32.901815 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:32Z","lastTransitionTime":"2025-11-29T01:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.003959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.003994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.004006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.004131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.004143 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.074725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:33 crc kubenswrapper[4749]: E1129 01:11:33.074911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.075158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.075326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:33 crc kubenswrapper[4749]: E1129 01:11:33.075431 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:33 crc kubenswrapper[4749]: E1129 01:11:33.075447 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.106044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.106087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.106096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.106111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.106120 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.208456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.208495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.208507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.208524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.208535 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.310947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.311022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.311044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.311078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.311099 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.356806 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/1.log" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.361757 4749 scope.go:117] "RemoveContainer" containerID="c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be" Nov 29 01:11:33 crc kubenswrapper[4749]: E1129 01:11:33.361909 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.374962 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.392333 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.406090 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.417029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.417103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.417124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.417154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.417173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.434782 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.457105 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.474902 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.493318 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.508143 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.520640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.520687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.520697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.520715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.520727 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.526804 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.551660 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.564665 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.578927 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.594882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.607528 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.622848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.622955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.622981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.623014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.623033 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.638098 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90d
ccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.724910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.724942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.724951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.724964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.724974 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.826987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.827033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.827043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.827058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.827069 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.922626 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn"] Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.923110 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.925075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.925636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.929188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.929237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.929246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.929260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.929269 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:33Z","lastTransitionTime":"2025-11-29T01:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.950118 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.964408 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.979892 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:33 crc kubenswrapper[4749]: I1129 01:11:33.993839 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:33Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.005024 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.021258 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.031453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.031494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.031506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.031523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.031533 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.037811 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.054420 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.077901 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlt2\" (UniqueName: \"kubernetes.io/projected/a6486350-f678-4175-a288-633b1ff9365d-kube-api-access-fjlt2\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.077949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.077985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.078008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6486350-f678-4175-a288-633b1ff9365d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.080456 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90d
ccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.093500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.107378 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.122644 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.133577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.133642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.133659 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.133684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.133700 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.149307 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.174599 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.179101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlt2\" (UniqueName: \"kubernetes.io/projected/a6486350-f678-4175-a288-633b1ff9365d-kube-api-access-fjlt2\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.179300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.179391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.179443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6486350-f678-4175-a288-633b1ff9365d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.180553 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.180733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6486350-f678-4175-a288-633b1ff9365d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.190155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6486350-f678-4175-a288-633b1ff9365d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.193697 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.199414 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlt2\" (UniqueName: \"kubernetes.io/projected/a6486350-f678-4175-a288-633b1ff9365d-kube-api-access-fjlt2\") pod \"ovnkube-control-plane-749d76644c-wj9gn\" (UID: \"a6486350-f678-4175-a288-633b1ff9365d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.212643 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.235243 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.238550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.238608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.238626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.238653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.238674 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.241871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.265259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.278390 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.300671 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.329658 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.344120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.344165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.344180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.344212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.344226 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.350258 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.365500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" event={"ID":"a6486350-f678-4175-a288-633b1ff9365d","Type":"ContainerStarted","Data":"3d9202f1472fb3580063487527db8c09073367f11b7d24d48fbf5324c9edda9f"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.369022 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.381549 4749 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 
01:11:34.397895 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.414729 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.427430 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.442129 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.446330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.446375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.446388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.446409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.446423 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.455963 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.470814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.483563 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.494516 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.519879 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.548943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.549084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.549187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.549315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.549461 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.651963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.652011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.652022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.652036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.652048 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.655086 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nczdn"] Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.655620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: E1129 01:11:34.655771 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.672788 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.685765 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.700550 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.733552 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.751298 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.754433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.754483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.754497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.754517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.754533 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.767000 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.784700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtrb6\" (UniqueName: \"kubernetes.io/projected/2bba1226-0e27-4cea-9eaa-d653f2061ec1-kube-api-access-mtrb6\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.784782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.788125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.811017 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.829705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.843709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.857150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.857212 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.857225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.857245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.857259 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.864745 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.881725 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.885802 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtrb6\" (UniqueName: \"kubernetes.io/projected/2bba1226-0e27-4cea-9eaa-d653f2061ec1-kube-api-access-mtrb6\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.885880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: E1129 
01:11:34.886119 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:34 crc kubenswrapper[4749]: E1129 01:11:34.886247 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:35.386192107 +0000 UTC m=+38.558341994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.898768 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.911086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtrb6\" (UniqueName: \"kubernetes.io/projected/2bba1226-0e27-4cea-9eaa-d653f2061ec1-kube-api-access-mtrb6\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " 
pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.913617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.932032 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.947518 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.960039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.960083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.960096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.960119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.960133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:34Z","lastTransitionTime":"2025-11-29T01:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:34 crc kubenswrapper[4749]: I1129 01:11:34.962213 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:34Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.063970 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.064035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.064054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.064082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.064101 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.074639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.074648 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.074849 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.074966 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.075548 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.075878 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.168049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.168093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.168107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.168130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.168146 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.270375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.270412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.270420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.270433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.270443 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.369947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" event={"ID":"a6486350-f678-4175-a288-633b1ff9365d","Type":"ContainerStarted","Data":"691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.369992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" event={"ID":"a6486350-f678-4175-a288-633b1ff9365d","Type":"ContainerStarted","Data":"abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.372016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.372063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.372075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.372092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.372102 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.382290 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.393331 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.393696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.393876 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.393962 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:36.393943759 +0000 UTC m=+39.566093616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.406093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.423800 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\"
:0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.435564 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.443567 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.451782 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.462401 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.474935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.474987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.475000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.475019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.475031 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.476254 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.488180 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.502630 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.516932 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.527026 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.538134 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.549269 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.559974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.578042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.578103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.578117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.578136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.578150 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.586347 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90d
ccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.680772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.680820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.680857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.680877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.680889 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.783913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.783977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.783997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.784022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.784044 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.798247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.798280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.798291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.798304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.798312 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.811396 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.815027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.815054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.815064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.815080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.815094 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.829090 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.832274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.832300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.832310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.832325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.832334 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.849090 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.853312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.853628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.853724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.853814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.853910 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.866771 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.870316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.870601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.870798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.870986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.871184 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.888117 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:35Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:35 crc kubenswrapper[4749]: E1129 01:11:35.888292 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.890640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.890683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.890695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.890719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.890732 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.993330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.993378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.993387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.993404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:35 crc kubenswrapper[4749]: I1129 01:11:35.993413 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:35Z","lastTransitionTime":"2025-11-29T01:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.073986 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:36 crc kubenswrapper[4749]: E1129 01:11:36.074245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.095947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.095987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.095996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.096011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.096020 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.199011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.199115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.199142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.199181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.199248 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.302378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.302444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.302479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.302509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.302534 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.401814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:36 crc kubenswrapper[4749]: E1129 01:11:36.401979 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:36 crc kubenswrapper[4749]: E1129 01:11:36.402026 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:38.402013749 +0000 UTC m=+41.574163606 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.405497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.405529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.405537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.405551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.405562 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.509272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.509313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.509322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.509337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.509347 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.612783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.612821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.612832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.612865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.612876 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.715686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.715743 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.715760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.715783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.715800 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.818148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.818181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.818189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.818222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.818262 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.921893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.921927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.921934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.921947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:36 crc kubenswrapper[4749]: I1129 01:11:36.921956 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:36Z","lastTransitionTime":"2025-11-29T01:11:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.024934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.024967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.024978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.024991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.025000 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.074866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:37 crc kubenswrapper[4749]: E1129 01:11:37.075006 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.075066 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.075117 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:37 crc kubenswrapper[4749]: E1129 01:11:37.075301 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:37 crc kubenswrapper[4749]: E1129 01:11:37.075374 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.095419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.116002 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.127825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.127883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.127904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.127931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.127951 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.137923 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8eb
d2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\
\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.149736 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.166999 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.190701 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.210612 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.225974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.231435 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.231665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.231817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.231986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.232173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.252014 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90d
ccbcd07d990782ae361726be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.268495 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.287883 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.304843 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.322082 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc
-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.334873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.334954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.334983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.335016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.335042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.340248 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.355593 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.368684 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.391417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:37Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.437335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.437391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.437403 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.437424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.437437 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.540029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.540072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.540084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.540100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.540112 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.644456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.644527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.644546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.644574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.644593 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.748442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.748510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.748529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.748556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.748577 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.852861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.852960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.853005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.853034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.853055 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.956373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.956488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.956506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.956531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:37 crc kubenswrapper[4749]: I1129 01:11:37.956548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:37Z","lastTransitionTime":"2025-11-29T01:11:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.060098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.060176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.060240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.060273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.060295 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
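The Ready=False condition that repeats above is the kubelet relaying the container runtime's network status: the runtime keeps answering NetworkReady=false because no CNI network config exists in /etc/kubernetes/cni/net.d/, and nothing will write one while the ovnkube-controller container (see the CrashLoopBackOff entry earlier in this log) stays down, since OVN-Kubernetes is the component that drops its config file into that directory. The readiness test itself is simple; the sketch below is a rough, hypothetical stand-in for it, not the runtime's actual implementation. It checks the directory named in the log for any config file with an extension libcni accepts.

    // cnicheck.go - hypothetical stand-in for the network-readiness probe:
    // does the CNI conf dir contain any usable network configuration?
    package main

    import (
            "fmt"
            "os"
            "path/filepath"
    )

    func main() {
            dir := "/etc/kubernetes/cni/net.d" // directory named in the log
            entries, err := os.ReadDir(dir)
            if err != nil {
                    fmt.Printf("NetworkReady=false: %v\n", err)
                    return
            }
            for _, e := range entries {
                    switch filepath.Ext(e.Name()) {
                    case ".conf", ".conflist", ".json": // extensions libcni accepts
                            fmt.Printf("NetworkReady=true: found %s\n", e.Name())
                            return
                    }
            }
            fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
    }

Once ovnkube-controller stays up long enough to write its config, the condition should flip back to Ready on the next sync.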
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.074659 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:11:38 crc kubenswrapper[4749]: E1129 01:11:38.074929 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.163397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.163703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.163791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.163880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.163970 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.267983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.268046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.268065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.268096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.268116 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.371719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.371787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.371814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.371847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.371888 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.426795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:11:38 crc kubenswrapper[4749]: E1129 01:11:38.427071 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Nov 29 01:11:38 crc kubenswrapper[4749]: E1129 01:11:38.427185 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:42.427160222 +0000 UTC m=+45.599310089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered
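The metrics-certs volume for network-metrics-daemon-nczdn fails for a different reason: the kubelet's secret manager has not yet registered the pod's secrets after the restart, so the lookup of openshift-multus/metrics-daemon-secret is refused outright and the volume manager schedules a retry with exponential backoff. The "durationBeforeRetry 4s" is consistent with a delay that starts small and doubles per consecutive failure; the initial 500ms and the roughly two-minute cap in the sketch below are assumptions modeled on kubelet's backoff, not values taken from this log.

    // backoff.go - illustrative only: the doubling-with-a-cap retry schedule
    // that would produce the "durationBeforeRetry 4s" seen above. Initial
    // delay and cap are assumptions, not read from the log.
    package main

    import (
            "fmt"
            "time"
    )

    func main() {
            delay := 500 * time.Millisecond      // assumed initial delay
            maxDelay := 2*time.Minute + 2*time.Second // assumed cap
            for i := 1; i <= 10; i++ {
                    fmt.Printf("failure %2d -> wait %s\n", i, delay)
                    delay *= 2
                    if delay > maxDelay {
                            delay = maxDelay
                    }
            }
    }

On this schedule, 4s corresponds to the fourth consecutive failure (500ms, 1s, 2s, 4s), which matches the restart being about 45 seconds old (m=+45.59).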
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.476040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.476076 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580604 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475849 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.475944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.476040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.476076 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580517 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.580604 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.683770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.683856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.683895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.683929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.683952 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.787357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.787431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.787460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.787491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.787511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.891174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.891272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.891294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.891326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.891348 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.995570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.995645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.995668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.995729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:38 crc kubenswrapper[4749]: I1129 01:11:38.995753 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:38Z","lastTransitionTime":"2025-11-29T01:11:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.074857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:39 crc kubenswrapper[4749]: E1129 01:11:39.075106 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.074876 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.075345 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:39 crc kubenswrapper[4749]: E1129 01:11:39.075541 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:39 crc kubenswrapper[4749]: E1129 01:11:39.075930 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.099463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.099520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.099539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.099568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.099624 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.203879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.204511 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.204532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.204564 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.204586 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.308971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.309040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.309060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.309088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.309106 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.412728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.412813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.412831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.412862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.412888 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.516545 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.516625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.516645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.516675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.516695 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.620310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.620392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.620413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.620446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.620468 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.723649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.723732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.723754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.723816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.723835 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.827189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.827431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.827498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.827566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.827626 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.930871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.930936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.930948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.930964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:39 crc kubenswrapper[4749]: I1129 01:11:39.930976 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:39Z","lastTransitionTime":"2025-11-29T01:11:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.035442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.035751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.035852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.035954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.036046 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.075068 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:40 crc kubenswrapper[4749]: E1129 01:11:40.075439 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
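The pod worker gives up on network-metrics-daemon-nczdn here for the same reason each time: a pod without a sandbox cannot have a new one created while the runtime reports the network as not ready, so the sync is skipped and retried later. A condensed sketch of that gate (function and type names are illustrative, not the pod worker's actual API):

package main

import (
	"errors"
	"fmt"
)

// runtimeStatus is an illustrative stand-in for the runtime status the
// kubelet polls from the container runtime; NetworkReady=false matches the log above.
type runtimeStatus struct {
	NetworkReady bool
	Reason       string
}

// syncPod sketches the gate: without a sandbox, and with the network not
// ready, the sync fails and the pod is skipped until the next retry.
func syncPod(pod string, hasSandbox bool, st runtimeStatus) error {
	if !hasSandbox {
		fmt.Printf("No sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
		if !st.NetworkReady {
			return errors.New("network is not ready: container runtime network not ready: " + st.Reason)
		}
	}
	return nil
}

func main() {
	st := runtimeStatus{NetworkReady: false, Reason: "NetworkPluginNotReady"}
	if err := syncPod("openshift-multus/network-metrics-daemon-nczdn", false, st); err != nil {
		fmt.Printf("Error syncing pod, skipping err=%q\n", err)
	}
}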
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.140012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.140093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.140114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.140147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.140167 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.242719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.242795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.242816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.242845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.242867 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.346957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.347004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.347015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.347030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.347041 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.449556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.449604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.449616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.449634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.449647 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.552741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.552803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.552818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.552834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.552844 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.656751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.656837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.656857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.656901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.656922 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.760389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.760782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.760994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.761243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.761469 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.864172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.864343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.864361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.864385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.864406 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.967525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.967582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.967599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.967624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:40 crc kubenswrapper[4749]: I1129 01:11:40.967641 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:40Z","lastTransitionTime":"2025-11-29T01:11:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.071098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.071162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.071180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.071246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.071266 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.075092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.075152 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.075239 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:41 crc kubenswrapper[4749]: E1129 01:11:41.075245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:41 crc kubenswrapper[4749]: E1129 01:11:41.075421 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:41 crc kubenswrapper[4749]: E1129 01:11:41.075574 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.174777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.174871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.174916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.174956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.174979 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.279311 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.279373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.279385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.279406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.279419 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.382850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.383381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.383543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.383723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.383886 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.492090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.492169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.492193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.492258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.492285 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.595823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.595877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.595894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.595915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.595929 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.699080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.699132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.699144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.699162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.699173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.803318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.803385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.803418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.803441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.803453 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.907836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.907910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.907927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.907954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:41 crc kubenswrapper[4749]: I1129 01:11:41.907972 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:41Z","lastTransitionTime":"2025-11-29T01:11:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.011102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.011145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.011154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.011169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.011179 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.074714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:42 crc kubenswrapper[4749]: E1129 01:11:42.074924 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.113980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.114047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.114058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.114095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.114110 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.216558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.216657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.216675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.216705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.216727 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.320057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.320128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.320146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.320175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.320223 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.423532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.423597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.423617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.423648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.423671 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.476419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:42 crc kubenswrapper[4749]: E1129 01:11:42.476761 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:42 crc kubenswrapper[4749]: E1129 01:11:42.476894 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:11:50.476860094 +0000 UTC m=+53.649009981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered
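The 8s durationBeforeRetry here is the doubling of the earlier 4s delay, and the underlying complaint is still the empty CNI configuration directory. A quick way to see what the runtime sees is to list /etc/kubernetes/cni/net.d/ for the file types a CNI loader accepts; this sketch is a diagnostic approximation, not the actual libcni loading code:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/cni/net.d" // directory named in the log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	found := false
	for _, e := range entries {
		// CNI loaders accept .conf, .conflist and .json network configs.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config present:", e.Name())
			found = true
		}
	}
	if !found {
		fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
	}
}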
Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.526447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.526519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.526539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.526572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.526593 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.629780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.629859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.629882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.629913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.629935 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.732498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.732621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.732642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.732681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.732702 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.836316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.836385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.836404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.836439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.836464 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.940227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.940310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.940332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.940364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:42 crc kubenswrapper[4749]: I1129 01:11:42.940384 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:42Z","lastTransitionTime":"2025-11-29T01:11:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.044274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.044354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.044378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.044408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.044427 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.074593 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.074645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.074619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:43 crc kubenswrapper[4749]: E1129 01:11:43.074899 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:43 crc kubenswrapper[4749]: E1129 01:11:43.074988 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:43 crc kubenswrapper[4749]: E1129 01:11:43.075238 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.147893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.147970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.147995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.148028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.148054 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.251846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.251940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.251971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.252010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.252036 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.356303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.356398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.356423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.356467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.356496 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.459984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.460069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.460091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.460124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.460146 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.563019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.563082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.563108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.563133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.563159 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.673291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.674119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.674210 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.674239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.674259 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.776385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.776446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.776464 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.776486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.776504 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.879823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.880325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.880514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.880689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.880827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.983345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.983405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.983420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.983444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:43 crc kubenswrapper[4749]: I1129 01:11:43.983459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:43Z","lastTransitionTime":"2025-11-29T01:11:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.074363 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:44 crc kubenswrapper[4749]: E1129 01:11:44.074495 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.087114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.087146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.087156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.087172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.087182 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.189629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.189675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.189687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.189707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.189722 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.294425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.294503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.294528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.294557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.294576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.398315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.398392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.398422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.398454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.398476 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.501801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.501858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.501875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.501899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.501917 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.605969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.606040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.606064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.606092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.606116 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.709015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.709075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.709098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.709129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.709152 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.811710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.811761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.811770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.811784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.811793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.913707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.913805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.913824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.913847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:44 crc kubenswrapper[4749]: I1129 01:11:44.913863 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:44Z","lastTransitionTime":"2025-11-29T01:11:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.016165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.016219 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.016227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.016240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.016249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.074191 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.074266 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.074389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:45 crc kubenswrapper[4749]: E1129 01:11:45.074542 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:45 crc kubenswrapper[4749]: E1129 01:11:45.075261 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:45 crc kubenswrapper[4749]: E1129 01:11:45.075416 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.076249 4749 scope.go:117] "RemoveContainer" containerID="c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.119476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.119520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.119536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.119557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.119572 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.222609 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.222660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.222679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.222703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.222789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.326690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.326750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.326765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.326787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.326802 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.416726 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/1.log" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.419636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.420120 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.429895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.429954 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.429970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.429999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.430017 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.439615 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.452779 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.467683 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.482525 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.496600 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.513063 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532402 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.532960 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.557934 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.579833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.607415 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.622275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.635241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.635275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.635289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.635305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.635318 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.636542 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.648287 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.677585 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.690880 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/mul
tus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.703354 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.713959 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:45Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.737562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.737608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.737619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.737647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.737660 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.839966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.840003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.840015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.840032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.840042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.942343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.942376 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.942385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.942402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:45 crc kubenswrapper[4749]: I1129 01:11:45.942410 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:45Z","lastTransitionTime":"2025-11-29T01:11:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.044542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.044604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.044624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.044649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.044667 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.073968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.074082 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.147319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.147353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.147364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.147378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.147387 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.213550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.213594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.213605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.213620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.213632 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.228353 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.231947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.231989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.232006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.232029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.232046 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.245950 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.250375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.250431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.250455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.250483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.250504 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.270311 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.273710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.273769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.273785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.273811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.273830 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.297617 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.302600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.302652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.302670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.302695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.302712 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.322184 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.322442 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.324542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.324574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.324584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.324598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.324609 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.425663 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/2.log" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426548 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426607 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426618 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.426659 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/1.log" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.431922 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60" exitCode=1 Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.431994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.432075 4749 scope.go:117] "RemoveContainer" containerID="c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.433361 4749 scope.go:117] "RemoveContainer" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60" Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.433644 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.457128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.475842 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.494147 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.526545 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.529108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.529342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.529379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.529429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.529455 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.549540 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.573219 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.592763 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.632713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.632779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.632796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.632824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.632844 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.636223 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.656463 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.673895 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.691820 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.711609 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.730283 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.735133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.735177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.735235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.735268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.735285 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.749352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.774022 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.789375 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.805419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:46Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.827066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.827350 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:12:18.827309412 +0000 UTC m=+81.999459359 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.837580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.837666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.837683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.837708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.837728 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.929344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.929464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.929562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.929620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929738 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929793 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929813 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929904 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:18.929871931 +0000 UTC m=+82.102021818 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929744 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929903 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.930016 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.930091 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.930141 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.929988 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:18.929966403 +0000 UTC m=+82.102116290 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.930182 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:18.930164118 +0000 UTC m=+82.102314015 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:11:46 crc kubenswrapper[4749]: E1129 01:11:46.930314 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:18.930273181 +0000 UTC m=+82.102423088 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.940397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.940459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.940482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.940512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:46 crc kubenswrapper[4749]: I1129 01:11:46.940534 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:46Z","lastTransitionTime":"2025-11-29T01:11:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.043350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.043433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.043459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.043491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.043519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.074708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.074716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.074818 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:47 crc kubenswrapper[4749]: E1129 01:11:47.075142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:47 crc kubenswrapper[4749]: E1129 01:11:47.075305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:47 crc kubenswrapper[4749]: E1129 01:11:47.075366 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.098290 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.121071 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.139893 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.147038 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.147092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.147141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.147170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.147193 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.170224 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9
e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9c10e89580852c07cc2f9f469aebff58ebcd90dccbcd07d990782ae361726be\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:31Z\\\",\\\"message\\\":\\\"neAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184896 6245 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1129 01:11:31.184957 6245 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.184996 6245 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185101 6245 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185357 6245 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1129 01:11:31.185670 6245 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:31.185816 6245 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] 
Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.189720 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.210452 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.227554 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.249977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.250020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.250029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.250043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.250053 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.252857 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.266364 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.278074 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.288007 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.301643 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.314795 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.326448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.341869 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.353633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.353704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.353807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.354010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.354056 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.354864 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.371236 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.437123 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/2.log" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.441566 4749 scope.go:117] "RemoveContainer" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60" Nov 29 01:11:47 crc kubenswrapper[4749]: E1129 01:11:47.441755 4749 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.457260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.457291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.457301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.457315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.457328 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.458336 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 
01:11:47.473686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.484726 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.499560 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.514319 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.527613 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.546134 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.560328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.560404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.560424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.560458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.560481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.569243 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.589710 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.617375 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.634162 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.648072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.663425 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.665053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.665079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.665088 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.665105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.665115 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.688301 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/
lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.706923 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.720699 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.740773 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:47Z is after 2025-08-24T17:21:41Z" Nov 29 
01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.767653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.767692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.767708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.767729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.767746 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.869958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.870000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.870012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.870028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.870038 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.972761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.972828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.972845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.972875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:47 crc kubenswrapper[4749]: I1129 01:11:47.972897 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:47Z","lastTransitionTime":"2025-11-29T01:11:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.073956 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:48 crc kubenswrapper[4749]: E1129 01:11:48.074132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.075606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.075636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.075644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.075658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.075668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.178607 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.178689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.178709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.178740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.178760 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.281625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.281709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.281745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.281781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.281803 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.386051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.386143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.386163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.386198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.386251 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.489531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.489604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.489621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.489649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.489666 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.593152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.593286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.593308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.593355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.593376 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.697714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.697781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.697807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.697841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.697866 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.800589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.800652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.800663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.800701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.800715 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.903125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.903199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.903253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.903283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:48 crc kubenswrapper[4749]: I1129 01:11:48.903310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:48Z","lastTransitionTime":"2025-11-29T01:11:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.006769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.006901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.006922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.006955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.006979 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.074029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:49 crc kubenswrapper[4749]: E1129 01:11:49.074232 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.074248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.074348 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:49 crc kubenswrapper[4749]: E1129 01:11:49.074387 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:49 crc kubenswrapper[4749]: E1129 01:11:49.074580 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.109469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.109566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.109622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.109648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.109669 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.212642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.212719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.212739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.212764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.212786 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.315177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.315257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.315267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.315283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.315294 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.418269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.418339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.418350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.418381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.418395 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.522360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.522425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.522439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.522459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.522474 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.625895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.625980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.626007 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.626041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.626067 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.730679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.730758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.730777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.730808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.730828 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.834493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.834551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.834569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.834591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.834609 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.937405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.937472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.937489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.937512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.937530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:49Z","lastTransitionTime":"2025-11-29T01:11:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.962097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.983655 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 29 01:11:49 crc kubenswrapper[4749]: I1129 01:11:49.985775 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:49Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.023281 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.041344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.041416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.041441 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.041468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.041490 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.048802 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.070719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.074857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:11:50 crc kubenswrapper[4749]: E1129 01:11:50.075158 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.095404 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.118415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.136319 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.144116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.144180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.144240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.144279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.144304 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.160141 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.176571 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.196124 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.223840 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.247126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.247493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.247582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.247680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.247793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.255177 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.274581 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.289009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.349611 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.349869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.349953 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.349972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.350012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.350034 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.364689 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.380003 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:50Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.451729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.451831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.451860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.451893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.451914 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.555631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.555683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.555698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.555722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.555740 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.571395 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:50 crc kubenswrapper[4749]: E1129 01:11:50.571582 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:50 crc kubenswrapper[4749]: E1129 01:11:50.571645 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:06.571629242 +0000 UTC m=+69.743779119 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.660307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.660690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.660922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.661113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.661319 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.765939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.766048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.766071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.766137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.766254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.871474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.872376 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.872593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.872766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.872947 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.976677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.976725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.976737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.976756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:50 crc kubenswrapper[4749]: I1129 01:11:50.976770 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:50Z","lastTransitionTime":"2025-11-29T01:11:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.074506 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.074663 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.074660 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:51 crc kubenswrapper[4749]: E1129 01:11:51.075286 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:51 crc kubenswrapper[4749]: E1129 01:11:51.074985 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:51 crc kubenswrapper[4749]: E1129 01:11:51.075530 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.080718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.080782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.080796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.080812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.080853 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.183821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.184282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.184303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.184328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.184348 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.288054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.288141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.288164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.288193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.288252 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.391281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.391354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.391379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.391412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.391440 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.494182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.494270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.494292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.494319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.494337 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.597070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.597110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.597125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.597146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.597160 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.700667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.700717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.700729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.700752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.700768 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.803372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.803420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.803434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.803453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.803466 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.906334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.906384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.906397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.906412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:51 crc kubenswrapper[4749]: I1129 01:11:51.906423 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:51Z","lastTransitionTime":"2025-11-29T01:11:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.009099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.009175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.009258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.009282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.009292 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.074435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:52 crc kubenswrapper[4749]: E1129 01:11:52.074596 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.111913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.111971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.111984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.112002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.112015 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.215535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.215574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.215583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.215601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.215610 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.317774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.317806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.317814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.317827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.317836 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.420943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.421339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.421435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.421523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.421604 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.523375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.523445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.523456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.523475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.523488 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.626812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.626859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.626872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.626889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.626902 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.730041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.730085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.730097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.730135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.730149 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.832626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.832670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.832686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.832701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.832713 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.935288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.935345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.935356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.935375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:52 crc kubenswrapper[4749]: I1129 01:11:52.935388 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:52Z","lastTransitionTime":"2025-11-29T01:11:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.037727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.037774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.037783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.037802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.037815 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.074574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.074600 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:53 crc kubenswrapper[4749]: E1129 01:11:53.074703 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.074765 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:53 crc kubenswrapper[4749]: E1129 01:11:53.074840 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:53 crc kubenswrapper[4749]: E1129 01:11:53.075003 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.139817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.139858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.139867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.139881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.139890 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.241846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.241886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.241900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.241918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.241930 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.344246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.344287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.344296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.344339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.344351 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.446740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.446773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.446781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.446794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.446802 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.548877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.548918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.548926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.548939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.548948 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.650994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.651028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.651037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.651048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.651057 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.753437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.753463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.753470 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.753482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.753492 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.855440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.855471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.855480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.855494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.855505 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.957841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.957866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.957874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.957886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:53 crc kubenswrapper[4749]: I1129 01:11:53.957895 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:53Z","lastTransitionTime":"2025-11-29T01:11:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.060647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.060695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.060708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.060724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.060736 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.074225 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:54 crc kubenswrapper[4749]: E1129 01:11:54.074344 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.162989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.163077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.163101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.163130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.163151 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.265932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.266005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.266029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.266102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.266126 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.369118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.369184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.369245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.369281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.369304 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.473733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.473785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.473804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.473828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.473847 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.577091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.577135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.577147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.577163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.577173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.680114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.680148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.680156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.680170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.680181 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.783078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.783116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.783124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.783139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.783152 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.884994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.885033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.885042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.885057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.885068 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.988384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.988432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.988442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.988456 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:54 crc kubenswrapper[4749]: I1129 01:11:54.988465 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:54Z","lastTransitionTime":"2025-11-29T01:11:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.074952 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.075047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.074970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:55 crc kubenswrapper[4749]: E1129 01:11:55.075293 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:55 crc kubenswrapper[4749]: E1129 01:11:55.075466 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:55 crc kubenswrapper[4749]: E1129 01:11:55.075619 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.091005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.091090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.091105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.091127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.091144 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.193582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.193608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.193616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.193628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.193635 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.296798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.296848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.296861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.296878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.296889 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.399582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.399689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.399706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.399726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.399741 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.502596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.502649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.502660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.502678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.502692 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.612899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.612944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.612955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.612971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.612981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.715689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.715726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.715737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.715754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.715767 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.818435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.818578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.818591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.818604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.818636 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.920701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.920754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.920762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.920777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:55 crc kubenswrapper[4749]: I1129 01:11:55.920789 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:55Z","lastTransitionTime":"2025-11-29T01:11:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.022712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.022770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.022778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.022792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.022802 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.075074 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.075352 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.125700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.125754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.125765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.125783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.125795 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.229239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.229334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.229353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.229385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.229405 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.331890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.331947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.331958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.331976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.331988 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.434773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.434895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.434917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.434951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.434975 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.537666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.537705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.537717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.537733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.537746 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.639697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.639748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.639766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.639789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.639808 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.654026 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:56Z is after 2025-08-24T17:21:41Z"
Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.658291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.658329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
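The patch error above is a second, independent failure: the node status update is rejected because the API server cannot call the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24 while the node clock reads 2025-11-29 (a common state for a CRC VM resumed long after its certificates were issued). A minimal Go sketch of the same validity check the failing TLS handshake performs; the certificate path is a placeholder argument, since the log does not name the file backing that listener:

    package main

    import (
    	"crypto/x509"
    	"encoding/pem"
    	"fmt"
    	"os"
    	"time"
    )

    func main() {
    	if len(os.Args) < 2 {
    		fmt.Println("usage: certcheck <path-to-pem-certificate>")
    		os.Exit(2)
    	}
    	data, err := os.ReadFile(os.Args[1]) // placeholder path, passed by the caller
    	if err != nil {
    		fmt.Println(err)
    		os.Exit(1)
    	}
    	block, _ := pem.Decode(data)
    	if block == nil {
    		fmt.Println("no PEM block found")
    		os.Exit(1)
    	}
    	cert, err := x509.ParseCertificate(block.Bytes)
    	if err != nil {
    		fmt.Println(err)
    		os.Exit(1)
    	}
    	now := time.Now().UTC()
    	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
    	if now.After(cert.NotAfter) {
    		// Same condition the x509 error in the log reports.
    		fmt.Println("certificate has expired")
    		os.Exit(1)
    	}
    }

Until that certificate is rotated, every node status patch fails the same way, which is why the kubelet logs "will retry" and the identical error recurs below.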
event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.658368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.658381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.658391 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.673404 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:56Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.678047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.678101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.678116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.678130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.678139 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.691847 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:56Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.695733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.695776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.695785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.695801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.695811 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.715086 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:56Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.720475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.720547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.720559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.720573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.720602 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.740883 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:56Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:56 crc kubenswrapper[4749]: E1129 01:11:56.740997 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.743263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.743292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.743301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.743315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.743328 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.845132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.845178 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.845190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.845240 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.845252 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.948159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.948220 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.948243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.948260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:56 crc kubenswrapper[4749]: I1129 01:11:56.948269 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:56Z","lastTransitionTime":"2025-11-29T01:11:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.051462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.051503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.051514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.051534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.051546 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.074575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.074774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:57 crc kubenswrapper[4749]: E1129 01:11:57.074856 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:57 crc kubenswrapper[4749]: E1129 01:11:57.075026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.075389 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:57 crc kubenswrapper[4749]: E1129 01:11:57.075520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.096112 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.110281 
4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.125287 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.140882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154223 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154233 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.154479 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.169622 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.198359 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9
e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.212625 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.224609 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.235407 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.250669 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.257306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.257351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 
crc kubenswrapper[4749]: I1129 01:11:57.257364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.257389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.257407 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.265987 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.277051 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.285960 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.295301 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.313592 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/l
og/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.324889 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.338915 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:11:57Z is after 2025-08-24T17:21:41Z" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.360536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.360563 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.360572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.360591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.360602 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.463175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.463246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.463256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.463276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.463286 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.565465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.565728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.565814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.565898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.565964 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.668761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.669013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.669024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.669036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.669045 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.771373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.771421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.771431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.771448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.771459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.874228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.874268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.874276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.874289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.874300 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.976515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.976552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.976562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.976577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:57 crc kubenswrapper[4749]: I1129 01:11:57.976588 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:57Z","lastTransitionTime":"2025-11-29T01:11:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.074004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:11:58 crc kubenswrapper[4749]: E1129 01:11:58.074467 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.074670 4749 scope.go:117] "RemoveContainer" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60" Nov 29 01:11:58 crc kubenswrapper[4749]: E1129 01:11:58.074821 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.079026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.079067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.079085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.079100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.079120 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.181855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.181896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.181906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.181922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.181931 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.285571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.285614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.285627 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.285644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.285657 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.388084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.388139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.388151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.388170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.388183 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.490167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.490264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.490289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.490318 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.490341 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.594228 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.594326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.594349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.594377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.594399 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.698233 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.698307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.698325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.698349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.698367 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.800873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.800928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.800946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.800970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.800988 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.903979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.904020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.904029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.904043 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:58 crc kubenswrapper[4749]: I1129 01:11:58.904052 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:58Z","lastTransitionTime":"2025-11-29T01:11:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.007496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.007550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.007562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.007587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.007601 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.074870 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.074881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:11:59 crc kubenswrapper[4749]: E1129 01:11:59.075026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:11:59 crc kubenswrapper[4749]: E1129 01:11:59.075272 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.075807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:11:59 crc kubenswrapper[4749]: E1129 01:11:59.076002 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.112062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.112130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.112149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.112174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.112194 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.215974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.216040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.216062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.216092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.216112 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.319706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.319768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.319791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.319822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.319843 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.422784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.423098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.423189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.423312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.423398 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.525801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.525834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.525842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.525854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:11:59 crc kubenswrapper[4749]: I1129 01:11:59.525862 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:11:59Z","lastTransitionTime":"2025-11-29T01:11:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.040047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.040092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.040104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.040121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.040134 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:00Z","lastTransitionTime":"2025-11-29T01:12:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:00 crc kubenswrapper[4749]: I1129 01:12:00.074854 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:00 crc kubenswrapper[4749]: E1129 01:12:00.074952 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.067011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.067052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.067061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.067073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.067081 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:01Z","lastTransitionTime":"2025-11-29T01:12:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.074464 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.074541 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:01 crc kubenswrapper[4749]: I1129 01:12:01.074693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:01 crc kubenswrapper[4749]: E1129 01:12:01.074687 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:01 crc kubenswrapper[4749]: E1129 01:12:01.074788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 01:12:01 crc kubenswrapper[4749]: E1129 01:12:01.074838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.074426 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:02 crc kubenswrapper[4749]: E1129 01:12:02.074704 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.099594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.099694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.099722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.099762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:02 crc kubenswrapper[4749]: I1129 01:12:02.099787 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:02Z","lastTransitionTime":"2025-11-29T01:12:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.029497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.029553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.029571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.029597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.029616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:03Z","lastTransitionTime":"2025-11-29T01:12:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.074430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:03 crc kubenswrapper[4749]: E1129 01:12:03.074595 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.074436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:03 crc kubenswrapper[4749]: I1129 01:12:03.074649 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:03 crc kubenswrapper[4749]: E1129 01:12:03.074875 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:03 crc kubenswrapper[4749]: E1129 01:12:03.075045 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.065660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.065745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.065761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.065782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.065795 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:04Z","lastTransitionTime":"2025-11-29T01:12:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:04 crc kubenswrapper[4749]: I1129 01:12:04.073995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:04 crc kubenswrapper[4749]: E1129 01:12:04.074227 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.073966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.073994 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.074060 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:05 crc kubenswrapper[4749]: E1129 01:12:05.074293 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:05 crc kubenswrapper[4749]: E1129 01:12:05.074376 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 01:12:05 crc kubenswrapper[4749]: E1129 01:12:05.074430 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.096681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.096721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.096730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.096765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.096776 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.199670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.199774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.199804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.199841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.199865 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.303526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.303569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.303580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.303595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.303606 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.413109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.413158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.413170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.413187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.413212 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.516906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.516969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.517013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.517036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.517049 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.621126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.621279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.621313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.621348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.621376 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.725288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.725346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.725357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.725379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.725393 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.829577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.829691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.829711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.829738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.829760 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.932474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.932509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.932518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.932530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:05 crc kubenswrapper[4749]: I1129 01:12:05.932541 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:05Z","lastTransitionTime":"2025-11-29T01:12:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.036398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.036443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.036454 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.036471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.036483 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.074295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:06 crc kubenswrapper[4749]: E1129 01:12:06.074527 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.138901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.138960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.138979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.139004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.139022 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.241962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.242041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.242065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.242099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.242124 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.344453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.344485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.344494 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.344508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.344518 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.447148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.447189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.447218 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.447236 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.447247 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.549413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.549436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.549446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.549458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.549467 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.651785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.651842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.651859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.651883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.651900 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.652432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:06 crc kubenswrapper[4749]: E1129 01:12:06.652720 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:12:06 crc kubenswrapper[4749]: E1129 01:12:06.652875 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:12:38.652842747 +0000 UTC m=+101.824992794 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.754490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.754523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.754532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.754546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.754556 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.857297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.857369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.857389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.857414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.857432 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.952981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.953066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.953088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.953125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.953145 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: E1129 01:12:06.971501 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:06Z is after 
2025-08-24T17:21:41Z" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.976378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.976423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.976442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.976465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:06 crc kubenswrapper[4749]: I1129 01:12:06.976480 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:06Z","lastTransitionTime":"2025-11-29T01:12:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:06 crc kubenswrapper[4749]: E1129 01:12:06.994746 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:06Z is after 
2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.000829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.000871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.000883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.000902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.000916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.018399 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 
2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.023265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.023319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.023333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.023347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.023361 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.037681 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 
2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.047909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.047951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.047962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.048002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.048013 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.069374 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 
2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.069572 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.071868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.071912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.071930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.071955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.071978 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.078961 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.079106 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.080469 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.081394 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.080475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:07 crc kubenswrapper[4749]: E1129 01:12:07.081494 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.108145 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.140857 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.157048 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.169886 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.174527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.174556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.174566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.174599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 
01:12:07.174609 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.183048 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.197252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 
29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.212099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.233664 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.246114 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.258221 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.273281 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.276736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.276795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.276815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.276847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.276865 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.282547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.293346 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.312008 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.324709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.334311 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.351988 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.368126 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:07Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.379068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.379108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.379120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.379140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.379152 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.482411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.482475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.482495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.482520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.482539 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.585903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.585976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.585991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.586013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.586025 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.688249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.688288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.688302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.688321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.688333 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.790772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.790806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.790816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.790831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.790841 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.892867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.892904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.892912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.892926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.892937 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.996497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.996536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.996548 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.996561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:07 crc kubenswrapper[4749]: I1129 01:12:07.996570 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:07Z","lastTransitionTime":"2025-11-29T01:12:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.074837 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:08 crc kubenswrapper[4749]: E1129 01:12:08.075003 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.098898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.098943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.098955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.098971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.098981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.201820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.201884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.201902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.201926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.201979 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.305753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.305869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.305890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.305917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.305938 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.408887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.408965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.409005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.409040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.409064 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.510692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.510755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.510776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.510801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.510820 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.511046 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/0.log" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.511107 4749 generic.go:334] "Generic (PLEG): container finished" podID="454ec33e-9530-4cf0-ad08-9c3a21b0e56b" containerID="57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583" exitCode=1 Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.511139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerDied","Data":"57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.511624 4749 scope.go:117] "RemoveContainer" containerID="57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.540273 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5
e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.558683 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.574711 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.592170 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.608503 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 
01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.615460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.615701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.615723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.615755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.615774 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.623935 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.639550 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.662414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.680356 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.700419 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.718257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.718504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.718721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.718990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.719185 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.721628 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.739782 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.756715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.772178 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.794992 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.815946 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.822703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.823011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.823167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.823349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.823546 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.834193 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.856144 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:08Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.927790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.927839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.927849 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.927865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:08 crc kubenswrapper[4749]: I1129 01:12:08.927875 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:08Z","lastTransitionTime":"2025-11-29T01:12:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.030327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.030381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.030393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.030410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.030423 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.074378 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.074434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.074501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:09 crc kubenswrapper[4749]: E1129 01:12:09.074587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:09 crc kubenswrapper[4749]: E1129 01:12:09.074721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:09 crc kubenswrapper[4749]: E1129 01:12:09.074803 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.133574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.133656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.133673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.133716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.133737 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.237163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.237253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.237268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.237287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.237301 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.339970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.340011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.340056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.340072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.340081 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.442014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.442054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.442063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.442075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.442084 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.515567 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/0.log" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.515650 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerStarted","Data":"3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.528391 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.540571 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.544980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.545046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.545063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.545089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.545107 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.559900 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.572010 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.590592 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.611626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.630872 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.646519 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.646973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.647009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.647021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.647038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.647050 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.657026 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.678239 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9
e86fa044e0068c1f15e3cc60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.694784 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.712277 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.730273 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.746541 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 
01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.750133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.750209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.750221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.750241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.750256 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.767639 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.781576 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.804589 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.815772 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:09Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.852591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.852648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.852665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.852687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.852704 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.956572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.957047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.957078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.957113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:09 crc kubenswrapper[4749]: I1129 01:12:09.957237 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:09Z","lastTransitionTime":"2025-11-29T01:12:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.060401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.060452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.060465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.060485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.060498 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.074955 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:10 crc kubenswrapper[4749]: E1129 01:12:10.075133 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.162670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.162722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.162739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.162761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.162781 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.266418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.266487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.266512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.266544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.266569 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.369445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.369514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.369532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.369556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.369575 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.473093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.473161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.473184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.473246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.473271 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.576114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.576238 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.576261 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.576284 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.576302 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.679741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.679815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.679835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.679866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.679883 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.782379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.782468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.782493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.782526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.782551 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.885983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.886038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.886054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.886075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.886090 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.988704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.988796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.988812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.988834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:10 crc kubenswrapper[4749]: I1129 01:12:10.988853 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:10Z","lastTransitionTime":"2025-11-29T01:12:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.074044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.074048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.074327 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:11 crc kubenswrapper[4749]: E1129 01:12:11.074487 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:11 crc kubenswrapper[4749]: E1129 01:12:11.074688 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:11 crc kubenswrapper[4749]: E1129 01:12:11.074730 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.075873 4749 scope.go:117] "RemoveContainer" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.091901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.091970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.091988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.092012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.092030 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.194840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.194902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.194919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.194943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.194962 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.296588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.296612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.296620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.296632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.296640 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.399221 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.399247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.399256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.399269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.399292 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.502237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.502274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.502284 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.502300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.502310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.525804 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/2.log" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.529545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.531051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.557022 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd745
5e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.579572 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.592943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.604171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.604243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.604253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.604266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.604274 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.608659 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.619561 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.631661 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.645855 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.657827 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.668997 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.688429 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network 
cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.701107 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.706753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.706788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.706798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.706814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.706825 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.725341 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.739879 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.756725 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.769877 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.782669 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.793299 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.803502 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z" Nov 29 
01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.809192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.809266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.809281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.809298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.809310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.911526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.911555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.911562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.911574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:11 crc kubenswrapper[4749]: I1129 01:12:11.911583 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:11Z","lastTransitionTime":"2025-11-29T01:12:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.014045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.014093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.014101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.014114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.014123 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.075016 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:12 crc kubenswrapper[4749]: E1129 01:12:12.075213 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.116502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.116557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.116569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.116589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.116602 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.218652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.218686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.218695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.218708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.218717 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
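[editor's note] The NotReady spam above reduces to one directly checkable claim: the kubelet sees no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch to verify that on the node itself; the path is quoted verbatim from the log, and the only assumption is a Python 3 interpreter on the host:

    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d/"  # path quoted verbatim from the log

    try:
        entries = sorted(os.listdir(CNI_CONF_DIR))
    except FileNotFoundError:
        print(f"{CNI_CONF_DIR} is missing entirely")
    else:
        # The network plugin (multus/ovn-kubernetes here) is expected to drop a
        # *.conf or *.conflist into this directory once it recovers; until then
        # the kubelet keeps re-asserting Ready=False.
        print(f"{len(entries)} entries in {CNI_CONF_DIR}")
        for name in entries:
            print(" ", name)

Note the cadence: the condition is re-recorded roughly every 100 ms (.014, .116, .218, .320, ... in the timestamps), which is why the same five-event block repeats below.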
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.320816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.320862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.320871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.320890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.320899 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.423917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.423968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.423977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.423992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.424002 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.527636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.527697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.527708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.527724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.527735 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
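[editor's note] Interleaved with the CNI spam, the kubelet's status patches keep being rejected because the network-node-identity webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24 while the node clock reads 2025-11-29. A small sketch to confirm the served certificate's validity window from the node; the address is taken from the log, and it assumes Python 3 plus the openssl CLI on the host:

    import ssl
    import subprocess

    # Fetch whatever certificate the webhook endpoint actually serves. No
    # verification is performed here, so this succeeds even though the
    # kubelet's verified handshake fails with "certificate has expired".
    pem = ssl.get_server_certificate(("127.0.0.1", 9743))

    # Print the validity window; if the log is right, expect
    # notAfter=Aug 24 17:21:41 2025 GMT, matching "is after 2025-08-24T17:21:41Z".
    subprocess.run(
        ["openssl", "x509", "-noout", "-dates"],
        input=pem.encode(),
        check=True,
    )

ssl.get_server_certificate() deliberately skips chain validation, which is what makes it usable as a probe here: it retrieves the PEM that the verified handshake rejects.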
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.535837 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/3.log"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.536852 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/2.log"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.540376 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" exitCode=1
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.540447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"}
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.540549 4749 scope.go:117] "RemoveContainer" containerID="3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.541562 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"
Nov 29 01:12:12 crc kubenswrapper[4749]: E1129 01:12:12.541849 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee"
Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.569945 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.586063 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.602331 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.616302 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.629181 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.630733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:12 
crc kubenswrapper[4749]: I1129 01:12:12.630802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.630818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.630841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.630855 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.644728 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.658657 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.668843 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.689618 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3bffbdb83951217ff666b2b0563bc206026a29f9e86fa044e0068c1f15e3cc60\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:11:46Z\\\",\\\"message\\\":\\\"1:11:46.026385 6453 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1129 01:11:46.026398 6453 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1129 01:11:46.026426 6453 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1129 01:11:46.026440 6453 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1129 01:11:46.026495 6453 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1129 01:11:46.026519 6453 factory.go:656] Stopping watch factory\\\\nI1129 01:11:46.026539 6453 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1129 01:11:46.026549 6453 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1129 01:11:46.026557 6453 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1129 01:11:46.026566 6453 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1129 01:11:46.026575 6453 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1129 01:11:46.026828 6453 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1129 01:11:46.027441 6453 ovnkube.go:599] Stopped ovnkube\\\\nI1129 01:11:46.027490 6453 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1129 01:11:46.027555 6453 ovnkube.go:137] failed to run ovnkube: [failed to start network cont\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:12Z\\\",\\\"message\\\":\\\"red: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z]\\\\nI1129 01:12:11.930915 6802 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 01:12:11.930950 6802 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1129 01:12:11.930963 6802 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mnsct\\\\nI1129 01:12:11.930976 68\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:12:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:
//f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.706036 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.724306 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.733175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.733235 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.733248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.733268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.733282 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.747388 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.764922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.778567 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.791655 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.802344 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.813938 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 
01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.833718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:12Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.835030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.835078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.835097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.835120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.835138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.937495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.937572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.937597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.937626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:12 crc kubenswrapper[4749]: I1129 01:12:12.937643 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:12Z","lastTransitionTime":"2025-11-29T01:12:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.040989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.041040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.041051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.041070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.041082 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.073984 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:13 crc kubenswrapper[4749]: E1129 01:12:13.074163 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.074293 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.074375 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:13 crc kubenswrapper[4749]: E1129 01:12:13.074616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:13 crc kubenswrapper[4749]: E1129 01:12:13.074782 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.143773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.143834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.143853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.143876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.143893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.247082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.247117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.247126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.247157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.247166 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.351519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.351596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.351620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.351660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.351683 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.455248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.455326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.455347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.455381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.455408 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.548440 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/3.log" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.555611 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:12:13 crc kubenswrapper[4749]: E1129 01:12:13.555964 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.559820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.559900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.559916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.559940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.559962 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.587648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.608856 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.630026 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.646120 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.664550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.664588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.664601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.664623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.664638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.666412 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.682946 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is 
after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.704867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.726505 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.743735 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.761112 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.767541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.767581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.767599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.767626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.767645 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.773882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.790035 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.812009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.828788 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.855565 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:12Z\\\",\\\"message\\\":\\\"red: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z]\\\\nI1129 01:12:11.930915 6802 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 01:12:11.930950 6802 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for 
pod on switch crc\\\\nI1129 01:12:11.930963 6802 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mnsct\\\\nI1129 01:12:11.930976 68\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:12:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"n
ame\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.871239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.871296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.871314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.871344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.871368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.876138 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.898355 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.919845 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:13Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.975058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.975126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.975152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.975222 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:13 crc kubenswrapper[4749]: I1129 01:12:13.975255 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:13Z","lastTransitionTime":"2025-11-29T01:12:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.074340 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:14 crc kubenswrapper[4749]: E1129 01:12:14.074497 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.078690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.078734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.078746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.078761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.078773 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.181513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.181576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.181593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.181656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.181678 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.284857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.284946 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.284974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.285013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.285037 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.388584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.388684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.388725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.388764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.388783 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.492577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.492662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.492681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.492714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.492755 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.596340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.596405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.596419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.596443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.596454 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.699785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.699855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.699882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.699920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.699943 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.803679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.803738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.803749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.803771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.803784 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.907708 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.907782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.907804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.907836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:14 crc kubenswrapper[4749]: I1129 01:12:14.907855 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:14Z","lastTransitionTime":"2025-11-29T01:12:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.010648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.010725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.010744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.010775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.010798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.074463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.074480 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.074726 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:15 crc kubenswrapper[4749]: E1129 01:12:15.074886 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:15 crc kubenswrapper[4749]: E1129 01:12:15.075024 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:15 crc kubenswrapper[4749]: E1129 01:12:15.075309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.112980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.113041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.113056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.113084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.113099 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.215364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.215409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.215418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.215433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.215450 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.318778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.319052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.319117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.319182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.319270 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.421776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.421812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.421820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.421833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.421842 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.524957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.525264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.525384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.525471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.525549 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.628139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.628229 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.628256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.628275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.628286 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.732762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.732856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.732879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.732913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.732937 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.836706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.836981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.837006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.837042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.837066 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.939772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.940136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.940367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.940587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:15 crc kubenswrapper[4749]: I1129 01:12:15.940792 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:15Z","lastTransitionTime":"2025-11-29T01:12:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.044712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.044796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.044814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.044845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.044864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.074193 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:16 crc kubenswrapper[4749]: E1129 01:12:16.074659 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.148948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.149044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.149065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.149102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.149128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.253968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.254055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.254077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.254112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.254134 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.357384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.357451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.357471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.357500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.357519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.460273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.460377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.460404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.460443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.460468 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.565479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.565558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.565579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.565611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.565641 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.671492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.671572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.671593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.671626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.671648 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.775582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.775655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.775676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.775744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.775766 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.879645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.879707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.879727 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.879755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.879774 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.984275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.984361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.984385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.984515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 29 01:12:16 crc kubenswrapper[4749]: I1129 01:12:16.984542 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:16Z","lastTransitionTime":"2025-11-29T01:12:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.074619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.074619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.074835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.075027 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.075146 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.075914 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.086982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.087048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.087073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.087777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.087922 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.098482 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"699d46ba-63d2-4366-b76f-344c5ef7bcdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-29T01:11:14Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1129 01:11:09.553611 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1129 01:11:09.554751 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-951499295/tls.crt::/tmp/serving-cert-951499295/tls.key\\\\\\\"\\\\nI1129 01:11:14.821837 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1129 01:11:14.825421 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1129 01:11:14.825458 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1129 01:11:14.825491 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1129 01:11:14.825500 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1129 01:11:14.832235 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1129 01:11:14.832261 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1129 01:11:14.832286 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1129 01:11:14.832300 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1129 01:11:14.832310 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1129 01:11:14.832317 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1129 01:11:14.832324 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1129 01:11:14.832332 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1129 01:11:14.836070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.101485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.102413 4749 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.102522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.102566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.102593 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.125971 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.130502 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50272ae8fb58860c18de50375d5f1da362f816c574a6e918c6f6db3336459458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.133672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.133781 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.133842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.133907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.134000 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.149138 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.150342 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.155553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.155656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.155679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.155711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.155731 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.168706 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52dcfbe3-4017-41b0-a1a6-f117eb831499\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4824350e8accdfde1479969cd56bd21a1bcb092f001051f561a64bf799314ccb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7830b93fbe438c7e0c90f326e2386ac5a34ddcdd58ac5ff15ad3daf865b688b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://774fb6d987d19c1c82d9aada588bf61eff8ebd2954403b77271791de2afe227b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69af33f4a35ca6465faaf97adae0b3b9529023dba405de9d2a0c38450b6a3a8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://413c052c9cce931a07c9df7d3305d22d78f01878359aa7acc23b55e256cab1c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://189edb58b9a81b131938cf90319dfb83a2f2a419a84340358055e74bba118f9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90b06261a0a22e0013c881431972a3c52e4baf0a524f3d47670d953ca390280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4mr44\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wz6xx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.179640 4749 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185450 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185530 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.185648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kr9qp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46bc9153-d89a-4dbe-a806-ae78091d27a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68072fe4cecef9b66b6b5a464a64a80f662308d67b55b698db1629cc8ecee0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcwxx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kr9qp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.203099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nczdn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bba1226-0e27-4cea-9eaa-d653f2061ec1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mtrb6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nczdn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.204600 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"ffaab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.209293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.209439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.209510 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.209582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.209654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.221717 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.222669 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0abfdf85-5794-49a6-a3fb-09e2a103db44\\\",\\\"systemUUID\\\":\\\"f
faab6e0-7081-491c-af88-2b486225a952\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: E1129 01:12:17.222914 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.225303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.225347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.225356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.225371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.225381 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.236243 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0453c99a4b3c0a184baeed09449b934de94a5dd509876fd40934ffbf02c560c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.251448 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"800b3936-ba93-47d8-9417-2fdc5ce4d171\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b12b073f873610fad22b7513ded0039569319bb4c0cca964c022c1f3c4c9eab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-28qfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mnsct\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.281707 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52d1a95a-c900-4842-82c4-5f4c37a16fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:12Z\\\",\\\"message\\\":\\\"red: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:11Z is after 2025-08-24T17:21:41Z]\\\\nI1129 01:12:11.930915 6802 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-marketplace/community-operators]} name:Service_openshift-marketplace/community-operators_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.189:50051:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d389393c-7ba9-422c-b3f5-06e391d537d2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1129 01:12:11.930950 6802 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1129 01:12:11.930963 6802 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-mnsct\\\\nI1129 01:12:11.930976 68\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:12:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:11:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pr2f7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-m7sg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.304244 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e27a24b-f764-47f3-a4f3-0b831a99eb48\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5953d49bd298f651a8eab54f88e18d5bf8c6521ef83aad70816b605f575d4138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac72633a4753acf8c8e976f08b13190097380be15b4a0c3dee925dddfd23373\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://12cb1553416112b7f154ebefe107d9bf9f328521069789c4d8b87fb259874ae7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.323113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.328768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.328804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.328815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.328835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.328847 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.339430 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc01e04d10d0b65fc02ab50e9dccfcb237c5b31ee77f67208e553b9085cd5cc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0641785f769b67fca21936726adaddef4de3c95d71041a68fa0d21980bc3f582\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.359151 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"347964c8-98bb-48de-ba75-e692416c64da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac60a17819fae4060e5a6ff7db143fdfa25c5c4d87ba4d7f96138b9d5f94a456\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee199361bc981620d87db5fde34c4556e6d15aed21bec380470525d43626cfd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22bce7f3a2bf14e0034345acceece9155539a6876a798affe41fd7e3f3fd421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c29b2a2aaa980d09d7703bc8d566db27232376
c52b06335749ee044e1a18dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bf36edb6891599bd0489413478ba537f2ad538c266bfee665b722964b3039d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://59f017b14b2282e312af6bfdd0a71f3ee038b0f9f5e2e67f3e1059616d6d1a6b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f16984ebf1cf5d04baa7e3b25c88f80d81d3078afa1818d645a7a5e6ae08f869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900b3777f32f13196dc8e6b54a4b0295778fca7f9a1a35b1a9efd82bd09dab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.374912 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"334b4646-8d9a-4e00-8743-dd70b1358c6d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0132b9736b8175d4650d3cbf4486e16474281c7c784f89a4c48dec29e1f64a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://854d3809743a74c0735eee206a5d7b9e3f6e26bd66d038d25cdaca434cd63592\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6972fa4545530f038079d351cbd2e4fb9c91bf99a6ec9e010a3e343979bf83bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:10:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f210aefb5e7e2377dc69a4ea4ccdbc01248ea745670ae221e8679e124c92954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-29T01:10:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-29T01:10:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:10:57Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.398310 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gf7g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"454ec33e-9530-4cf0-ad08-9c3a21b0e56b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:12:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-29T01:12:07Z\\\",\\\"message\\\":\\\"2025-11-29T01:11:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10\\\\n2025-11-29T01:11:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_bbd85ea4-4ee1-4321-ae8e-e3f7f2648a10 to /host/opt/cni/bin/\\\\n2025-11-29T01:11:22Z [verbose] multus-daemon started\\\\n2025-11-29T01:11:22Z [verbose] Readiness Indicator file check\\\\n2025-11-29T01:12:07Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:12:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gvvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gf7g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.412566 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zrr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d508d053-4b4d-472c-afa0-43a89560cdf7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0791ea0ff9e89b3a1828e1d912eb2d77c4fa9c73dde3f4a747eee70ac816a8fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kgbrm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zrr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.424013 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6486350-f678-4175-a288-633b1ff9365d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-29T01:11:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf6b4dbe5d16256d0ff60ed8ea9317c98ff7cf78d026b0c18373bbe79b4fb38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://691460af43e1eb95720fc5bfdca77649ac700f2af1ed3bf49d72d27b6dbb3745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-29T01:11:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fjlt2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-29T01:11:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wj9gn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-29T01:12:17Z is after 2025-08-24T17:21:41Z" Nov 29 
01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.431871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.431945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.431965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.431995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.432015 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.535828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.535896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.535921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.535955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.535984 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.639542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.639615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.639630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.639654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.639672 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.742924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.742974 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.742985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.743004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.743016 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.847164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.847294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.847327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.847359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.847379 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.950126 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.950166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.950176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.950191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:17 crc kubenswrapper[4749]: I1129 01:12:17.950214 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:17Z","lastTransitionTime":"2025-11-29T01:12:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.053828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.053879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.053892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.053914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.053926 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.074332 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.074569 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.157593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.157647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.157657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.157679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.157691 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.261060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.261113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.261135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.261155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.261168 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.364308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.364358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.364372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.364395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.364412 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.468190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.468268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.468278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.468319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.468331 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.572929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.572975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.572983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.573001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.573013 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.676989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.677071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.677092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.677121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.677140 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.780789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.781299 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.781325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.781360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.781386 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.886059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.886146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.886171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.886239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.886270 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.895695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.895904 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.895859841 +0000 UTC m=+146.068009748 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.989451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.989528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.989547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.989579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.989600 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:18Z","lastTransitionTime":"2025-11-29T01:12:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.997646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.997733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.997782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:18 crc kubenswrapper[4749]: I1129 01:12:18.997824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.997900 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.997966 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998048 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.998022624 +0000 UTC m=+146.170172791 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998071 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.998060785 +0000 UTC m=+146.170210902 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998089 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998147 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998177 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998096 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998276 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998298 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.99826982 +0000 UTC m=+146.170419717 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998302 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:12:18 crc kubenswrapper[4749]: E1129 01:12:18.998401 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.998372262 +0000 UTC m=+146.170522189 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.074498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:19 crc kubenswrapper[4749]: E1129 01:12:19.074682 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.074938 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:19 crc kubenswrapper[4749]: E1129 01:12:19.075001 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.075156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:19 crc kubenswrapper[4749]: E1129 01:12:19.075257 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.093098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.093176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.093247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.093305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.093327 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.198099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.198148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.198158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.198175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.198187 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.302115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.302266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.302291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.302327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.302351 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.404524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.404582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.404598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.404626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.404643 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.508042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.508115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.508138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.508181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.508250 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.610466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.610516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.610525 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.610540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.610550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.713704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.713889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.713910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.714393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.714416 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.817744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.817809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.817827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.817854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.817872 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.920655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.920684 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.920692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.920707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:19 crc kubenswrapper[4749]: I1129 01:12:19.920717 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:19Z","lastTransitionTime":"2025-11-29T01:12:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.028928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.029003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.029024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.029057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.029082 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.074805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:20 crc kubenswrapper[4749]: E1129 01:12:20.075009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.132520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.132644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.132673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.132706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.132730 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.236060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.236134 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.236153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.236184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.236254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.339916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.340000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.340026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.340061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.340088 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.444098 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.444173 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.444227 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.444261 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.444282 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.548265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.548435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.548505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.548542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.548568 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.652112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.652188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.652238 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.652275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.652299 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.755856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.755919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.755936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.755966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.755989 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.859905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.859973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.859992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.860022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:20 crc kubenswrapper[4749]: I1129 01:12:20.860042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 29 01:12:21 crc kubenswrapper[4749]: I1129 01:12:21.074568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:21 crc kubenswrapper[4749]: I1129 01:12:21.074619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:21 crc kubenswrapper[4749]: I1129 01:12:21.074650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:21 crc kubenswrapper[4749]: E1129 01:12:21.074746 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:21 crc kubenswrapper[4749]: E1129 01:12:21.074847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:21 crc kubenswrapper[4749]: E1129 01:12:21.074890 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
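While the network is down, the same workloads fail their sync on every retry: the two openshift-network-diagnostics checkers and the networking-console-plugin above, and the Multus metrics daemon below. For triaging a dump like this, a small filter that counts "Error syncing pod, skipping" lines per pod separates the affected workloads from the heartbeat noise. The helper below is illustrative only and not part of any cluster tooling; it assumes the journal text is fed on stdin.

// syncerrs.go: count pod-sync failures per pod in a kubelet journal dump.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the kubelet's pod-sync failure lines and captures the pod name.
var syncErr = regexp.MustCompile(`"Error syncing pod, skipping".*?pod="([^"]+)"`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := syncErr.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%s: %d failed sync attempts\n", pod, n)
	}
}

Run it as: go run syncerrs.go < kubelet.journal. On this capture it should report the four pods named in these entries.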
Nov 29 01:12:22 crc kubenswrapper[4749]: I1129 01:12:22.074818 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:22 crc kubenswrapper[4749]: E1129 01:12:22.074981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:23 crc kubenswrapper[4749]: I1129 01:12:23.074704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:23 crc kubenswrapper[4749]: I1129 01:12:23.074769 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:23 crc kubenswrapper[4749]: I1129 01:12:23.074708 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:23 crc kubenswrapper[4749]: E1129 01:12:23.074871 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 01:12:23 crc kubenswrapper[4749]: E1129 01:12:23.074986 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 29 01:12:23 crc kubenswrapper[4749]: E1129 01:12:23.075224 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:24 crc kubenswrapper[4749]: I1129 01:12:24.074638 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:24 crc kubenswrapper[4749]: E1129 01:12:24.074838 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
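The condition={...} payload in the setters.go entries is plain JSON, so it can be pulled out of the journal and inspected mechanically. The decoder below is a sketch: the struct mirrors only the six fields visible in these log lines, not the full Kubernetes v1.NodeCondition type, and the sample payload is the one recorded at 01:12:20 above.

// condition.go: decode the Ready condition payload logged by setters.go.
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition covers just the fields present in this capture (assumption:
// a subset of the real v1.NodeCondition schema).
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	payload := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:20Z","lastTransitionTime":"2025-11-29T01:12:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(payload), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}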
Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.076378 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:25 crc kubenswrapper[4749]: E1129 01:12:25.076692 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.076930 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.077102 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:25 crc kubenswrapper[4749]: E1129 01:12:25.077569 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 29 01:12:25 crc kubenswrapper[4749]: E1129 01:12:25.077869 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Has your network provider started?"} Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.976856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.976902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.976917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.976934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:25 crc kubenswrapper[4749]: I1129 01:12:25.976946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:25Z","lastTransitionTime":"2025-11-29T01:12:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.074733 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:26 crc kubenswrapper[4749]: E1129 01:12:26.075101 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.080351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.080402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.080411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.080429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.080441 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.182568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.182614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.182623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.182638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.182652 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.285151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.285241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.285283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.285312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.285334 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.389071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.389610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.389770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.389933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.390087 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.492930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.493273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.493365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.493466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.493547 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.595998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.596042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.596058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.596078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.596094 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.698581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.698608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.698615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.698628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.698636 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.800788 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.800827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.800841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.800858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.800869 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.903429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.903493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.903520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.903550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:26 crc kubenswrapper[4749]: I1129 01:12:26.903568 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:26Z","lastTransitionTime":"2025-11-29T01:12:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.006153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.006253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.006272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.006301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.006323 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.075063 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.075127 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.075173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:27 crc kubenswrapper[4749]: E1129 01:12:27.075298 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:27 crc kubenswrapper[4749]: E1129 01:12:27.075317 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:27 crc kubenswrapper[4749]: E1129 01:12:27.075355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.075967 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:12:27 crc kubenswrapper[4749]: E1129 01:12:27.076218 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109419 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109588 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.109932 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.109912607 podStartE2EDuration="1m11.109912607s" podCreationTimestamp="2025-11-29 01:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.108412179 +0000 UTC m=+90.280562036" watchObservedRunningTime="2025-11-29 01:12:27.109912607 +0000 UTC m=+90.282062464" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.175666 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.175649277 podStartE2EDuration="1m12.175649277s" podCreationTimestamp="2025-11-29 01:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.173600606 +0000 UTC m=+90.345750463" watchObservedRunningTime="2025-11-29 01:12:27.175649277 +0000 UTC m=+90.347799134" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.186938 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.186920926 podStartE2EDuration="38.186920926s" podCreationTimestamp="2025-11-29 01:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.186631739 +0000 UTC m=+90.358781606" watchObservedRunningTime="2025-11-29 01:12:27.186920926 +0000 UTC m=+90.359070783" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.212347 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2gf7g" podStartSLOduration=67.212326077 podStartE2EDuration="1m7.212326077s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.20077981 +0000 UTC m=+90.372929667" watchObservedRunningTime="2025-11-29 01:12:27.212326077 +0000 UTC m=+90.384475934" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.212656 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-75zrr" podStartSLOduration=67.212648405 podStartE2EDuration="1m7.212648405s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.212609594 +0000 UTC m=+90.384759451" watchObservedRunningTime="2025-11-29 01:12:27.212648405 +0000 UTC m=+90.384798262" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.213894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.213916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.213926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.213940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.213951 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.243663 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wj9gn" podStartSLOduration=67.243641653 podStartE2EDuration="1m7.243641653s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.224597251 +0000 UTC m=+90.396747118" watchObservedRunningTime="2025-11-29 01:12:27.243641653 +0000 UTC m=+90.415791510" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.260102 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.260083811 podStartE2EDuration="1m12.260083811s" podCreationTimestamp="2025-11-29 01:11:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.244523695 +0000 UTC m=+90.416673562" watchObservedRunningTime="2025-11-29 01:12:27.260083811 +0000 UTC m=+90.432233668" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.297371 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wz6xx" podStartSLOduration=67.297351755 podStartE2EDuration="1m7.297351755s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.2878648 +0000 UTC m=+90.460014667" watchObservedRunningTime="2025-11-29 01:12:27.297351755 +0000 UTC m=+90.469501602" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.297688 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kr9qp" podStartSLOduration=67.297683023 podStartE2EDuration="1m7.297683023s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.29635153 +0000 UTC m=+90.468501397" watchObservedRunningTime="2025-11-29 01:12:27.297683023 +0000 UTC m=+90.469832880" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.316094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.316144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.316153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.316169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:27 crc 
kubenswrapper[4749]: I1129 01:12:27.316178 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.361047 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podStartSLOduration=67.361022194 podStartE2EDuration="1m7.361022194s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:27.360984063 +0000 UTC m=+90.533133920" watchObservedRunningTime="2025-11-29 01:12:27.361022194 +0000 UTC m=+90.533172071" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.418799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.418837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.418846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.418862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.418872 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.439587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.439630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.439643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.439660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.439673 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-29T01:12:27Z","lastTransitionTime":"2025-11-29T01:12:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
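Numerically, in these entries, podStartSLOduration is exactly watchObservedRunningTime minus podCreationTimestamp. A quick check for kube-controller-manager-crc from the table above (the layout string matches the log's timestamp format):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Go's default time.Time string layout, as used in the log fields.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	// Values copied from the kube-controller-manager-crc entry.
	created, _ := time.Parse(layout, "2025-11-29 01:11:16 +0000 UTC")
	running, _ := time.Parse(layout, "2025-11-29 01:12:27.109912607 +0000 UTC")

	// Prints 1m11.109912607s, matching podStartSLOduration/podStartE2EDuration.
	fmt.Println(running.Sub(created))
}
```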
Has your network provider started?"} Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.485573 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr"] Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.486048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.488385 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.488452 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.488518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.489526 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.509026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.509084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.509110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.509137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.509170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.609934 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.609979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.609999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.610019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.610042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.610060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.610119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.610799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.615821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.627380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m6dxr\" (UID: \"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: I1129 01:12:27.798433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" Nov 29 01:12:27 crc kubenswrapper[4749]: W1129 01:12:27.812525 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ea3ae8_3f0b_4002_ae0e_c8733b119c3e.slice/crio-e38b94b4deb81ddec293190871186ea99183bdebfd110b0f3d56288d95ef267c WatchSource:0}: Error finding container e38b94b4deb81ddec293190871186ea99183bdebfd110b0f3d56288d95ef267c: Status 404 returned error can't find the container with id e38b94b4deb81ddec293190871186ea99183bdebfd110b0f3d56288d95ef267c Nov 29 01:12:28 crc kubenswrapper[4749]: I1129 01:12:28.075029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:28 crc kubenswrapper[4749]: E1129 01:12:28.075335 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:28 crc kubenswrapper[4749]: I1129 01:12:28.614140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" event={"ID":"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e","Type":"ContainerStarted","Data":"d4f77f803acba3ddbf9433fc2dd350fca0f81e97ce698dcb309607508072ab0a"} Nov 29 01:12:28 crc kubenswrapper[4749]: I1129 01:12:28.614217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" event={"ID":"b1ea3ae8-3f0b-4002-ae0e-c8733b119c3e","Type":"ContainerStarted","Data":"e38b94b4deb81ddec293190871186ea99183bdebfd110b0f3d56288d95ef267c"} Nov 29 01:12:28 crc kubenswrapper[4749]: I1129 01:12:28.632515 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m6dxr" podStartSLOduration=68.632487276 podStartE2EDuration="1m8.632487276s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:28.632124047 +0000 UTC m=+91.804273934" watchObservedRunningTime="2025-11-29 01:12:28.632487276 +0000 UTC m=+91.804637163" Nov 29 01:12:29 crc kubenswrapper[4749]: I1129 01:12:29.074605 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:12:29 crc kubenswrapper[4749]: I1129 01:12:29.074605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 29 01:12:29 crc kubenswrapper[4749]: I1129 01:12:29.074735 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 29 01:12:29 crc kubenswrapper[4749]: I1129 01:12:29.074858 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
[Each util.go:30 entry is followed by the matching E1129 pod_workers.go:1301 "Error syncing pod, skipping" CNI error for that pod (podUIDs "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8", "9d751cbb-f2e2-430d-9754-c882a5e924a5" and "3b6479f0-333b-4a96-9adf-2099afdc2447").]
Nov 29 01:12:30 crc kubenswrapper[4749]: I1129 01:12:30.074638 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn"
Nov 29 01:12:30 crc kubenswrapper[4749]: E1129 01:12:30.074955 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1"
Nov 29 01:12:30 crc kubenswrapper[4749]: I1129 01:12:30.092074 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
[The no-sandbox/CNI-error cycle continues on a steady cadence: the network-check-source, network-check-target and networking-console-plugin trio at 01:12:31, :33, :35 and :37, and network-metrics-daemon-nczdn at 01:12:32, :34, :36 and :38.]
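Every one of these errors asks the same question: is there a CNI config in /etc/kubernetes/cni/net.d/ yet? A trivial stand-alone check of that directory (the path is taken from the error text; everything else here is illustrative):

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// The directory the kubelet complains about in every CNI error above.
	const netd = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(netd)
	if err != nil {
		fmt.Println("cannot read", netd, "->", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println(netd, "is empty: the network plugin has not written its config yet")
		return
	}
	for _, e := range entries {
		fmt.Println("found CNI config:", e.Name())
	}
}
```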
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:38 crc kubenswrapper[4749]: I1129 01:12:38.732294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:38 crc kubenswrapper[4749]: E1129 01:12:38.732474 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:12:38 crc kubenswrapper[4749]: E1129 01:12:38.732568 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs podName:2bba1226-0e27-4cea-9eaa-d653f2061ec1 nodeName:}" failed. No retries permitted until 2025-11-29 01:13:42.732545335 +0000 UTC m=+165.904695222 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs") pod "network-metrics-daemon-nczdn" (UID: "2bba1226-0e27-4cea-9eaa-d653f2061ec1") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 29 01:12:39 crc kubenswrapper[4749]: I1129 01:12:39.075903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:39 crc kubenswrapper[4749]: E1129 01:12:39.076538 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:39 crc kubenswrapper[4749]: I1129 01:12:39.076580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:39 crc kubenswrapper[4749]: E1129 01:12:39.077086 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:39 crc kubenswrapper[4749]: I1129 01:12:39.078329 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:12:39 crc kubenswrapper[4749]: E1129 01:12:39.078637 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-m7sg4_openshift-ovn-kubernetes(52d1a95a-c900-4842-82c4-5f4c37a16fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" Nov 29 01:12:39 crc kubenswrapper[4749]: I1129 01:12:39.078651 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:39 crc kubenswrapper[4749]: E1129 01:12:39.078878 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:40 crc kubenswrapper[4749]: I1129 01:12:40.074265 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:40 crc kubenswrapper[4749]: E1129 01:12:40.074452 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:41 crc kubenswrapper[4749]: I1129 01:12:41.074839 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:41 crc kubenswrapper[4749]: I1129 01:12:41.075002 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:41 crc kubenswrapper[4749]: I1129 01:12:41.075028 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:41 crc kubenswrapper[4749]: E1129 01:12:41.076453 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:41 crc kubenswrapper[4749]: E1129 01:12:41.076754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:41 crc kubenswrapper[4749]: E1129 01:12:41.076991 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:42 crc kubenswrapper[4749]: I1129 01:12:42.074951 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:42 crc kubenswrapper[4749]: E1129 01:12:42.075288 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:43 crc kubenswrapper[4749]: I1129 01:12:43.074223 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:43 crc kubenswrapper[4749]: I1129 01:12:43.074279 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:43 crc kubenswrapper[4749]: I1129 01:12:43.074248 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:43 crc kubenswrapper[4749]: E1129 01:12:43.074397 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:43 crc kubenswrapper[4749]: E1129 01:12:43.074434 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:43 crc kubenswrapper[4749]: E1129 01:12:43.074505 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:44 crc kubenswrapper[4749]: I1129 01:12:44.074260 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:44 crc kubenswrapper[4749]: E1129 01:12:44.074463 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:45 crc kubenswrapper[4749]: I1129 01:12:45.075030 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:45 crc kubenswrapper[4749]: I1129 01:12:45.075054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:45 crc kubenswrapper[4749]: I1129 01:12:45.076720 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:45 crc kubenswrapper[4749]: E1129 01:12:45.076960 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:45 crc kubenswrapper[4749]: E1129 01:12:45.077097 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:45 crc kubenswrapper[4749]: E1129 01:12:45.077263 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:46 crc kubenswrapper[4749]: I1129 01:12:46.074278 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:46 crc kubenswrapper[4749]: E1129 01:12:46.074480 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:47 crc kubenswrapper[4749]: I1129 01:12:47.074476 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:47 crc kubenswrapper[4749]: I1129 01:12:47.074580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:47 crc kubenswrapper[4749]: E1129 01:12:47.074635 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:47 crc kubenswrapper[4749]: I1129 01:12:47.074828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:47 crc kubenswrapper[4749]: E1129 01:12:47.074992 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:47 crc kubenswrapper[4749]: E1129 01:12:47.075177 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:48 crc kubenswrapper[4749]: I1129 01:12:48.074408 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:48 crc kubenswrapper[4749]: E1129 01:12:48.074616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:49 crc kubenswrapper[4749]: I1129 01:12:49.074964 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:49 crc kubenswrapper[4749]: E1129 01:12:49.075308 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:49 crc kubenswrapper[4749]: I1129 01:12:49.074989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:49 crc kubenswrapper[4749]: I1129 01:12:49.076355 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:49 crc kubenswrapper[4749]: E1129 01:12:49.076618 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:49 crc kubenswrapper[4749]: E1129 01:12:49.083015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:50 crc kubenswrapper[4749]: I1129 01:12:50.074570 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:50 crc kubenswrapper[4749]: E1129 01:12:50.074935 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:51 crc kubenswrapper[4749]: I1129 01:12:51.075537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:51 crc kubenswrapper[4749]: I1129 01:12:51.075595 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:51 crc kubenswrapper[4749]: I1129 01:12:51.075680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:51 crc kubenswrapper[4749]: E1129 01:12:51.075761 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:51 crc kubenswrapper[4749]: E1129 01:12:51.075915 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:51 crc kubenswrapper[4749]: E1129 01:12:51.076121 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:52 crc kubenswrapper[4749]: I1129 01:12:52.074031 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:52 crc kubenswrapper[4749]: E1129 01:12:52.074377 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.074604 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.074815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.075456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:53 crc kubenswrapper[4749]: E1129 01:12:53.075742 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:53 crc kubenswrapper[4749]: E1129 01:12:53.076139 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:53 crc kubenswrapper[4749]: E1129 01:12:53.076587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.077891 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.709866 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/3.log" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.712979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerStarted","Data":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.713628 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.749591 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podStartSLOduration=93.749563944 podStartE2EDuration="1m33.749563944s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:53.747600665 +0000 UTC m=+116.919750592" watchObservedRunningTime="2025-11-29 01:12:53.749563944 +0000 UTC m=+116.921713861" Nov 29 01:12:53 crc kubenswrapper[4749]: I1129 01:12:53.750071 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.750063206 podStartE2EDuration="23.750063206s" podCreationTimestamp="2025-11-29 01:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:12:37.098409128 +0000 UTC m=+100.270559005" watchObservedRunningTime="2025-11-29 01:12:53.750063206 +0000 UTC m=+116.922213093" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.074304 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:54 crc kubenswrapper[4749]: E1129 01:12:54.074558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.123644 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nczdn"] Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.718731 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/1.log" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.719769 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/0.log" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.719819 4749 generic.go:334] "Generic (PLEG): container finished" podID="454ec33e-9530-4cf0-ad08-9c3a21b0e56b" containerID="3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5" exitCode=1 Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.719908 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.719963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerDied","Data":"3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5"} Nov 29 01:12:54 crc kubenswrapper[4749]: E1129 01:12:54.720013 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.720055 4749 scope.go:117] "RemoveContainer" containerID="57781b05307cfe3dbe28bc4d42041c46bd95db41be42e893bc0d0f792a215583" Nov 29 01:12:54 crc kubenswrapper[4749]: I1129 01:12:54.720496 4749 scope.go:117] "RemoveContainer" containerID="3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5" Nov 29 01:12:54 crc kubenswrapper[4749]: E1129 01:12:54.720683 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2gf7g_openshift-multus(454ec33e-9530-4cf0-ad08-9c3a21b0e56b)\"" pod="openshift-multus/multus-2gf7g" podUID="454ec33e-9530-4cf0-ad08-9c3a21b0e56b" Nov 29 01:12:55 crc kubenswrapper[4749]: I1129 01:12:55.074716 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:55 crc kubenswrapper[4749]: I1129 01:12:55.074815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:55 crc kubenswrapper[4749]: E1129 01:12:55.074879 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:55 crc kubenswrapper[4749]: E1129 01:12:55.074979 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:55 crc kubenswrapper[4749]: I1129 01:12:55.075112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:55 crc kubenswrapper[4749]: E1129 01:12:55.075240 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:55 crc kubenswrapper[4749]: I1129 01:12:55.725606 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/1.log" Nov 29 01:12:56 crc kubenswrapper[4749]: I1129 01:12:56.074942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:56 crc kubenswrapper[4749]: E1129 01:12:56.075084 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:57 crc kubenswrapper[4749]: E1129 01:12:57.039967 4749 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 29 01:12:57 crc kubenswrapper[4749]: I1129 01:12:57.074469 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:57 crc kubenswrapper[4749]: I1129 01:12:57.074614 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:57 crc kubenswrapper[4749]: I1129 01:12:57.074630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:57 crc kubenswrapper[4749]: E1129 01:12:57.076969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:57 crc kubenswrapper[4749]: E1129 01:12:57.077150 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:57 crc kubenswrapper[4749]: E1129 01:12:57.077391 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:12:57 crc kubenswrapper[4749]: E1129 01:12:57.169646 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 01:12:58 crc kubenswrapper[4749]: I1129 01:12:58.074846 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:12:58 crc kubenswrapper[4749]: E1129 01:12:58.075055 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:12:59 crc kubenswrapper[4749]: I1129 01:12:59.074922 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:12:59 crc kubenswrapper[4749]: E1129 01:12:59.075238 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:12:59 crc kubenswrapper[4749]: I1129 01:12:59.075399 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:12:59 crc kubenswrapper[4749]: I1129 01:12:59.075442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:12:59 crc kubenswrapper[4749]: E1129 01:12:59.075538 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:12:59 crc kubenswrapper[4749]: E1129 01:12:59.075668 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:00 crc kubenswrapper[4749]: I1129 01:13:00.074025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:00 crc kubenswrapper[4749]: E1129 01:13:00.074310 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:01 crc kubenswrapper[4749]: I1129 01:13:01.074953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:01 crc kubenswrapper[4749]: I1129 01:13:01.075082 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:01 crc kubenswrapper[4749]: E1129 01:13:01.075147 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:01 crc kubenswrapper[4749]: E1129 01:13:01.075331 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:01 crc kubenswrapper[4749]: I1129 01:13:01.075456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:01 crc kubenswrapper[4749]: E1129 01:13:01.075557 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:02 crc kubenswrapper[4749]: I1129 01:13:02.074441 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:02 crc kubenswrapper[4749]: E1129 01:13:02.074919 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:02 crc kubenswrapper[4749]: E1129 01:13:02.171174 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 01:13:03 crc kubenswrapper[4749]: I1129 01:13:03.074748 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:03 crc kubenswrapper[4749]: E1129 01:13:03.074916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:03 crc kubenswrapper[4749]: I1129 01:13:03.074919 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:03 crc kubenswrapper[4749]: E1129 01:13:03.075015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:03 crc kubenswrapper[4749]: I1129 01:13:03.075386 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:03 crc kubenswrapper[4749]: E1129 01:13:03.075584 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:04 crc kubenswrapper[4749]: I1129 01:13:04.074710 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:04 crc kubenswrapper[4749]: E1129 01:13:04.074862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:05 crc kubenswrapper[4749]: I1129 01:13:05.074373 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:05 crc kubenswrapper[4749]: I1129 01:13:05.074442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:05 crc kubenswrapper[4749]: I1129 01:13:05.074393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:05 crc kubenswrapper[4749]: E1129 01:13:05.074638 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:05 crc kubenswrapper[4749]: E1129 01:13:05.075429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:05 crc kubenswrapper[4749]: E1129 01:13:05.081940 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:06 crc kubenswrapper[4749]: I1129 01:13:06.074882 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:06 crc kubenswrapper[4749]: E1129 01:13:06.075187 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:07 crc kubenswrapper[4749]: I1129 01:13:07.074669 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:07 crc kubenswrapper[4749]: I1129 01:13:07.074754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:07 crc kubenswrapper[4749]: E1129 01:13:07.076409 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:07 crc kubenswrapper[4749]: I1129 01:13:07.076456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:07 crc kubenswrapper[4749]: E1129 01:13:07.076504 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:07 crc kubenswrapper[4749]: E1129 01:13:07.076709 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:07 crc kubenswrapper[4749]: E1129 01:13:07.171787 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 01:13:08 crc kubenswrapper[4749]: I1129 01:13:08.074285 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:08 crc kubenswrapper[4749]: E1129 01:13:08.074558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:09 crc kubenswrapper[4749]: I1129 01:13:09.074271 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:09 crc kubenswrapper[4749]: I1129 01:13:09.074472 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:09 crc kubenswrapper[4749]: E1129 01:13:09.074587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:09 crc kubenswrapper[4749]: I1129 01:13:09.074324 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:09 crc kubenswrapper[4749]: E1129 01:13:09.074768 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:09 crc kubenswrapper[4749]: E1129 01:13:09.075110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:10 crc kubenswrapper[4749]: I1129 01:13:10.074800 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:10 crc kubenswrapper[4749]: E1129 01:13:10.075347 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:10 crc kubenswrapper[4749]: I1129 01:13:10.075506 4749 scope.go:117] "RemoveContainer" containerID="3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5" Nov 29 01:13:10 crc kubenswrapper[4749]: I1129 01:13:10.788389 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/1.log" Nov 29 01:13:10 crc kubenswrapper[4749]: I1129 01:13:10.788442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerStarted","Data":"ddec8c7d8ebf8d0d087a1bdc6857aeb0504b9501b77508a43197f1f205864a99"} Nov 29 01:13:11 crc kubenswrapper[4749]: I1129 01:13:11.075099 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:11 crc kubenswrapper[4749]: I1129 01:13:11.075185 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:11 crc kubenswrapper[4749]: E1129 01:13:11.075285 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 29 01:13:11 crc kubenswrapper[4749]: I1129 01:13:11.075349 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:11 crc kubenswrapper[4749]: E1129 01:13:11.075473 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 29 01:13:11 crc kubenswrapper[4749]: E1129 01:13:11.075585 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 29 01:13:12 crc kubenswrapper[4749]: I1129 01:13:12.074522 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:12 crc kubenswrapper[4749]: E1129 01:13:12.075394 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nczdn" podUID="2bba1226-0e27-4cea-9eaa-d653f2061ec1" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.075006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.075430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.075628 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.079078 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.079417 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.080111 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 01:13:13 crc kubenswrapper[4749]: I1129 01:13:13.083233 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 29 01:13:14 crc kubenswrapper[4749]: I1129 01:13:14.074855 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:14 crc kubenswrapper[4749]: I1129 01:13:14.078151 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 29 01:13:14 crc kubenswrapper[4749]: I1129 01:13:14.078158 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.313732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.369466 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wl6jq"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.370410 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.371020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.371070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mp4r"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.371611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.371710 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.374868 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fclkv"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.375650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.375712 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.376831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.377430 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.378035 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.378837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.378838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.390180 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.390547 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.390682 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.390899 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.391025 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.391271 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.391569 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.391892 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.392336 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.392522 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.392723 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.392924 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.394496 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.395241 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.402596 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.402871 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403331 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403511 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403712 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403799 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403332 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403873 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.403950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404022 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404219 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404234 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404250 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404354 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404390 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 
01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404402 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404826 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404899 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.404972 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.405677 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.406381 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.406854 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.407102 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjgxm"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.407633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.413140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.416985 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.426956 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.428928 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.429086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.429591 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.429656 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.429990 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.443559 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.443847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445077 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445117 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445518 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445673 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.445765 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.446678 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.446954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447188 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447395 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447486 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447710 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447904 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-klx5g"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.447913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448211 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448677 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448825 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.448936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449010 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449103 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449153 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449271 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449301 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.449458 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.451027 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.454707 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 
01:13:18.455505 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8cq9l"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.455919 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.456187 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.456457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.456576 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.456945 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8cq9l"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.460642 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mp4r"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.464290 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmgqj"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.465795 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.466302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.466637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.470287 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rlstl"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.471000 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.471448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wl6jq"]
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.471558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s"
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.472248 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.473536 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.473801 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474106 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474291 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474310 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474548 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.474840 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.475413 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.475565 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.475726 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.475883 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.478190 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.479021 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.479144 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.481767 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.496809 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4drv6"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.497627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.497882 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498048 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498267 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498405 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498549 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498608 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498663 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.498732 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.500526 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.503686 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.503910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.504414 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.504503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.504738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.504759 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.504983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.505280 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.505917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.517859 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.518581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.518755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.519426 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.519709 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.520459 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.520832 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.521262 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.521335 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.521378 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.521536 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.521934 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.522558 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.522814 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.522832 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.525354 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.526046 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.527811 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.528706 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.528952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.529289 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.529303 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2d5l2"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.529711 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.530551 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.540079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.544510 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.544691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4874dba-b38e-4f76-8b77-4bd2c58596d3-machine-approver-tls\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.544802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-serving-cert\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.544899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmjbm\" (UniqueName: \"kubernetes.io/projected/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-kube-api-access-gmjbm\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.544991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc6bg\" (UniqueName: \"kubernetes.io/projected/18a54704-5d0d-4caa-b347-2751a525a666-kube-api-access-zc6bg\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-config\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545216 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a19644e5-e46c-4286-8f46-5022e2bb45b4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-auth-proxy-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkzvj\" (UniqueName: \"kubernetes.io/projected/f4874dba-b38e-4f76-8b77-4bd2c58596d3-kube-api-access-kkzvj\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59eb74a-0de0-46e4-8bfd-905c7b538393-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-trusted-ca\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-client\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-serving-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.545930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit-dir\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-config\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7ht\" (UniqueName: \"kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad210bef-5379-4c64-9f18-61d79338e155-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59eb74a-0de0-46e4-8bfd-905c7b538393-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnjk5\" (UniqueName: \"kubernetes.io/projected/a19644e5-e46c-4286-8f46-5022e2bb45b4-kube-api-access-dnjk5\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-serving-cert\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drpb\" (UniqueName: \"kubernetes.io/projected/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-kube-api-access-5drpb\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-encryption-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-images\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz6xf\" (UniqueName: \"kubernetes.io/projected/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-kube-api-access-dz6xf\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdpq\" (UniqueName: \"kubernetes.io/projected/ad210bef-5379-4c64-9f18-61d79338e155-kube-api-access-fzdpq\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-node-pullsecrets\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-serving-cert\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-image-import-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-config\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpddk\" (UniqueName: 
\"kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad210bef-5379-4c64-9f18-61d79338e155-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwqdl\" (UniqueName: \"kubernetes.io/projected/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-kube-api-access-xwqdl\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5769h\" (UniqueName: \"kubernetes.io/projected/a59eb74a-0de0-46e4-8bfd-905c7b538393-kube-api-access-5769h\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.546987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18a54704-5d0d-4caa-b347-2751a525a666-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.547007 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.547911 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.556859 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.558395 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.559118 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.559437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.560901 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.561342 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.571803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.574524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5t7v2"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.575956 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.576569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.579297 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.582433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.585621 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxg42"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.586742 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.587720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fclkv"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.593598 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-klx5g"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.593658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.594882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.595791 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.600125 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.607856 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcjx5"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.612362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.612501 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.615790 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xxssl"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.616650 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.617167 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.617362 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.618560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.620360 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.621728 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.627082 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.629560 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmgqj"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.631139 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.632261 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.633422 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.633750 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.634784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.636311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8cq9l"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.638144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.639400 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rlstl"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.643459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjgxm"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.644985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.646951 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad210bef-5379-4c64-9f18-61d79338e155-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647483 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-config\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7ht\" (UniqueName: \"kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59eb74a-0de0-46e4-8bfd-905c7b538393-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnjk5\" (UniqueName: \"kubernetes.io/projected/a19644e5-e46c-4286-8f46-5022e2bb45b4-kube-api-access-dnjk5\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmj84\" (UniqueName: \"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-kube-api-access-pmj84\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-apiservice-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" 
Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-serving-cert\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5drpb\" (UniqueName: \"kubernetes.io/projected/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-kube-api-access-5drpb\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-encryption-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-images\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz6xf\" (UniqueName: \"kubernetes.io/projected/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-kube-api-access-dz6xf\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdpq\" (UniqueName: 
\"kubernetes.io/projected/ad210bef-5379-4c64-9f18-61d79338e155-kube-api-access-fzdpq\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ac3e00-8bfb-4b64-9b87-e71416270280-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647868 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d826b104-632f-4c53-9f01-04a7efdfc3c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-node-pullsecrets\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647904 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-images\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-image-import-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.647984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-config\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-serving-cert\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q698c\" (UniqueName: \"kubernetes.io/projected/07e6f770-684e-4d69-9d50-f85af092d6bc-kube-api-access-q698c\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpddk\" (UniqueName: \"kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-webhook-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: 
\"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ac3e00-8bfb-4b64-9b87-e71416270280-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad210bef-5379-4c64-9f18-61d79338e155-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07e6f770-684e-4d69-9d50-f85af092d6bc-metrics-tls\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648236 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwqdl\" (UniqueName: \"kubernetes.io/projected/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-kube-api-access-xwqdl\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccc661b4-9b29-4855-bad5-9973dc692c59-proxy-tls\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh8lk\" (UniqueName: \"kubernetes.io/projected/c1c95d96-1bbb-4e89-a92c-5df613dc068c-kube-api-access-bh8lk\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648314 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5769h\" (UniqueName: \"kubernetes.io/projected/a59eb74a-0de0-46e4-8bfd-905c7b538393-kube-api-access-5769h\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18a54704-5d0d-4caa-b347-2751a525a666-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnqpk\" (UniqueName: \"kubernetes.io/projected/ceffc235-a012-44b5-91ff-f45a19502453-kube-api-access-cnqpk\") pod \"downloads-7954f5f757-8cq9l\" (UID: \"ceffc235-a012-44b5-91ff-f45a19502453\") " pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d15ea300-3b1a-46ab-b775-d771839bb2ee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4874dba-b38e-4f76-8b77-4bd2c58596d3-machine-approver-tls\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 
01:13:18.648438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d826b104-632f-4c53-9f01-04a7efdfc3c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-serving-cert\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc6bg\" (UniqueName: \"kubernetes.io/projected/18a54704-5d0d-4caa-b347-2751a525a666-kube-api-access-zc6bg\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmjbm\" (UniqueName: \"kubernetes.io/projected/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-kube-api-access-gmjbm\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-config\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87k4n\" (UniqueName: \"kubernetes.io/projected/ccc661b4-9b29-4855-bad5-9973dc692c59-kube-api-access-87k4n\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a19644e5-e46c-4286-8f46-5022e2bb45b4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648586 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-auth-proxy-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkzvj\" (UniqueName: \"kubernetes.io/projected/f4874dba-b38e-4f76-8b77-4bd2c58596d3-kube-api-access-kkzvj\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59eb74a-0de0-46e4-8bfd-905c7b538393-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-trusted-ca\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-client\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtfqb\" (UniqueName: \"kubernetes.io/projected/98ac3e00-8bfb-4b64-9b87-e71416270280-kube-api-access-wtfqb\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c1c95d96-1bbb-4e89-a92c-5df613dc068c-tmpfs\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-js6zm\" (UniqueName: \"kubernetes.io/projected/d15ea300-3b1a-46ab-b775-d771839bb2ee-kube-api-access-js6zm\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-serving-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit-dir\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.648840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit-dir\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.649134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.649329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.650101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.650131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-config\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.650690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-audit\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc 
kubenswrapper[4749]: I1129 01:13:18.650720 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a59eb74a-0de0-46e4-8bfd-905c7b538393-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.650895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.651630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.651844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.652447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-node-pullsecrets\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.652457 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.652542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-trusted-ca\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.653144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad210bef-5379-4c64-9f18-61d79338e155-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.653185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.653751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4874dba-b38e-4f76-8b77-4bd2c58596d3-auth-proxy-config\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.654088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-image-import-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.654403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.654419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.654975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.655712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-images\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.655851 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.656395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-serving-ca\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.656521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.656531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-service-ca-bundle\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.656808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a19644e5-e46c-4286-8f46-5022e2bb45b4-config\") pod 
\"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.656969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad210bef-5379-4c64-9f18-61d79338e155-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.657116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-config\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.657334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.657497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.657852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.658158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-serving-cert\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.658294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxg42"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.658574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a19644e5-e46c-4286-8f46-5022e2bb45b4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.658654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/f4874dba-b38e-4f76-8b77-4bd2c58596d3-machine-approver-tls\") pod \"machine-approver-56656f9798-hrpj2\" (UID: 
\"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.659047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/18a54704-5d0d-4caa-b347-2751a525a666-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.659093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a59eb74a-0de0-46e4-8bfd-905c7b538393-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.659833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-etcd-client\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.660012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.660268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-serving-cert\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.661294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2d5l2"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.661425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-serving-cert\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.661562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-serving-cert\") pod \"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.662021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-encryption-config\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.662349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4drv6"] Nov 29 
01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.663346 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.664450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.665472 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-6l4z5"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.666286 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.666465 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcjx5"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.667487 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xxssl"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.668639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.679535 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jgbtg"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.680446 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.681022 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.693851 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jgbtg"] Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.695364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.721405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.734191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ac3e00-8bfb-4b64-9b87-e71416270280-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d826b104-632f-4c53-9f01-04a7efdfc3c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749429 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-images\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q698c\" (UniqueName: \"kubernetes.io/projected/07e6f770-684e-4d69-9d50-f85af092d6bc-kube-api-access-q698c\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-webhook-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ac3e00-8bfb-4b64-9b87-e71416270280-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07e6f770-684e-4d69-9d50-f85af092d6bc-metrics-tls\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccc661b4-9b29-4855-bad5-9973dc692c59-proxy-tls\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh8lk\" (UniqueName: \"kubernetes.io/projected/c1c95d96-1bbb-4e89-a92c-5df613dc068c-kube-api-access-bh8lk\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnqpk\" (UniqueName: 
\"kubernetes.io/projected/ceffc235-a012-44b5-91ff-f45a19502453-kube-api-access-cnqpk\") pod \"downloads-7954f5f757-8cq9l\" (UID: \"ceffc235-a012-44b5-91ff-f45a19502453\") " pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749632 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d15ea300-3b1a-46ab-b775-d771839bb2ee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d826b104-632f-4c53-9f01-04a7efdfc3c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87k4n\" (UniqueName: \"kubernetes.io/projected/ccc661b4-9b29-4855-bad5-9973dc692c59-kube-api-access-87k4n\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749744 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtfqb\" (UniqueName: \"kubernetes.io/projected/98ac3e00-8bfb-4b64-9b87-e71416270280-kube-api-access-wtfqb\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js6zm\" (UniqueName: \"kubernetes.io/projected/d15ea300-3b1a-46ab-b775-d771839bb2ee-kube-api-access-js6zm\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c1c95d96-1bbb-4e89-a92c-5df613dc068c-tmpfs\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmj84\" (UniqueName: 
\"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-kube-api-access-pmj84\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.749856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-apiservice-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.750530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c1c95d96-1bbb-4e89-a92c-5df613dc068c-tmpfs\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.750554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.753987 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.774690 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.794149 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.813987 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.834386 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.854298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.874925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.886686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/07e6f770-684e-4d69-9d50-f85af092d6bc-metrics-tls\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.894254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.913552 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 01:13:18 crc 
kubenswrapper[4749]: I1129 01:13:18.933978 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.940510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ccc661b4-9b29-4855-bad5-9973dc692c59-images\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.954262 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.975041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.984294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ccc661b4-9b29-4855-bad5-9973dc692c59-proxy-tls\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:18 crc kubenswrapper[4749]: I1129 01:13:18.994686 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.013854 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.034399 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.055049 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.074703 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.114784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.133538 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.154981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.174622 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.193535 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.215749 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.235596 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.254159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.274761 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.296134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.304057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-apiservice-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.309604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c95d96-1bbb-4e89-a92c-5df613dc068c-webhook-cert\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.315156 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.334622 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.343112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98ac3e00-8bfb-4b64-9b87-e71416270280-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.354287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.361293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ac3e00-8bfb-4b64-9b87-e71416270280-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.374163 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.393769 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 
01:13:19.414505 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.435284 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.454714 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.474475 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.494487 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.515434 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.532553 4749 request.go:700] Waited for 1.002717595s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/secrets?fieldSelector=metadata.name%3Dkube-controller-manager-operator-serving-cert&limit=500&resourceVersion=0 Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.535416 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.555319 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.575400 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.595831 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.615320 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.625833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d826b104-632f-4c53-9f01-04a7efdfc3c0-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.634287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.667049 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.676232 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 29 01:13:19 
crc kubenswrapper[4749]: I1129 01:13:19.678549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d826b104-632f-4c53-9f01-04a7efdfc3c0-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.694006 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.705030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d15ea300-3b1a-46ab-b775-d771839bb2ee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.716116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.754557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.774974 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.793804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.814560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.834014 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.854965 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.889368 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.894324 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.914394 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.935532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.954181 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.975071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 29 01:13:19 crc kubenswrapper[4749]: I1129 01:13:19.994807 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.014519 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.034712 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.054621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.074315 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.094182 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.114754 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.134986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.154843 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.174710 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.194831 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.214016 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.234882 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.253991 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.274631 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.295249 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.343815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdpq\" (UniqueName: \"kubernetes.io/projected/ad210bef-5379-4c64-9f18-61d79338e155-kube-api-access-fzdpq\") pod \"openshift-controller-manager-operator-756b6f6bc6-ttpw6\" (UID: \"ad210bef-5379-4c64-9f18-61d79338e155\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.362786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7ht\" (UniqueName: 
\"kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht\") pod \"controller-manager-879f6c89f-f75rz\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.373101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnjk5\" (UniqueName: \"kubernetes.io/projected/a19644e5-e46c-4286-8f46-5022e2bb45b4-kube-api-access-dnjk5\") pod \"machine-api-operator-5694c8668f-2mp4r\" (UID: \"a19644e5-e46c-4286-8f46-5022e2bb45b4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.378754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.401411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpddk\" (UniqueName: \"kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk\") pod \"route-controller-manager-6576b87f9c-wv4pz\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.420457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.421640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5769h\" (UniqueName: \"kubernetes.io/projected/a59eb74a-0de0-46e4-8bfd-905c7b538393-kube-api-access-5769h\") pod \"openshift-apiserver-operator-796bbdcf4f-xlc5j\" (UID: \"a59eb74a-0de0-46e4-8bfd-905c7b538393\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.438152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkzvj\" (UniqueName: \"kubernetes.io/projected/f4874dba-b38e-4f76-8b77-4bd2c58596d3-kube-api-access-kkzvj\") pod \"machine-approver-56656f9798-hrpj2\" (UID: \"f4874dba-b38e-4f76-8b77-4bd2c58596d3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.454502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwqdl\" (UniqueName: \"kubernetes.io/projected/cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f-kube-api-access-xwqdl\") pod \"console-operator-58897d9998-bjgxm\" (UID: \"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f\") " pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.476148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz6xf\" (UniqueName: \"kubernetes.io/projected/534e3d1d-c71c-44e9-a589-b198ee8fa4e0-kube-api-access-dz6xf\") pod \"apiserver-76f77b778f-wl6jq\" (UID: \"534e3d1d-c71c-44e9-a589-b198ee8fa4e0\") " pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.494992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmjbm\" (UniqueName: \"kubernetes.io/projected/d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4-kube-api-access-gmjbm\") pod 
\"authentication-operator-69f744f599-fclkv\" (UID: \"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.504840 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.514365 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc6bg\" (UniqueName: \"kubernetes.io/projected/18a54704-5d0d-4caa-b347-2751a525a666-kube-api-access-zc6bg\") pod \"cluster-samples-operator-665b6dd947-kg6tt\" (UID: \"18a54704-5d0d-4caa-b347-2751a525a666\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.532889 4749 request.go:700] Waited for 1.866256346s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.534608 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.536638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.537398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drpb\" (UniqueName: \"kubernetes.io/projected/4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8-kube-api-access-5drpb\") pod \"openshift-config-operator-7777fb866f-6qbnn\" (UID: \"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.554587 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.564989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.575170 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.576228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.595373 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.598816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.615237 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.632228 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.633311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.634318 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.638605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:20 crc kubenswrapper[4749]: W1129 01:13:20.650273 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05a47eb_0468_48cb_9e4e_19156acdda3a.slice/crio-167e8adc5871f830d8bc301e28069ee7630e247de465ce2d993e57f792633a9e WatchSource:0}: Error finding container 167e8adc5871f830d8bc301e28069ee7630e247de465ce2d993e57f792633a9e: Status 404 returned error can't find the container with id 167e8adc5871f830d8bc301e28069ee7630e247de465ce2d993e57f792633a9e Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.655388 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.676359 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q698c\" (UniqueName: \"kubernetes.io/projected/07e6f770-684e-4d69-9d50-f85af092d6bc-kube-api-access-q698c\") pod \"dns-operator-744455d44c-rlstl\" (UID: \"07e6f770-684e-4d69-9d50-f85af092d6bc\") " pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.677522 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6"] Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.695560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.700599 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.719661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh8lk\" (UniqueName: \"kubernetes.io/projected/c1c95d96-1bbb-4e89-a92c-5df613dc068c-kube-api-access-bh8lk\") pod \"packageserver-d55dfcdfc-kjzmw\" (UID: \"c1c95d96-1bbb-4e89-a92c-5df613dc068c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.730687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnqpk\" (UniqueName: \"kubernetes.io/projected/ceffc235-a012-44b5-91ff-f45a19502453-kube-api-access-cnqpk\") pod \"downloads-7954f5f757-8cq9l\" (UID: \"ceffc235-a012-44b5-91ff-f45a19502453\") " pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.751710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87k4n\" (UniqueName: \"kubernetes.io/projected/ccc661b4-9b29-4855-bad5-9973dc692c59-kube-api-access-87k4n\") pod \"machine-config-operator-74547568cd-vz6bd\" (UID: \"ccc661b4-9b29-4855-bad5-9973dc692c59\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.758303 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.768382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js6zm\" (UniqueName: \"kubernetes.io/projected/d15ea300-3b1a-46ab-b775-d771839bb2ee-kube-api-access-js6zm\") pod \"multus-admission-controller-857f4d67dd-2d5l2\" (UID: \"d15ea300-3b1a-46ab-b775-d771839bb2ee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.791265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtfqb\" (UniqueName: \"kubernetes.io/projected/98ac3e00-8bfb-4b64-9b87-e71416270280-kube-api-access-wtfqb\") pod \"kube-storage-version-migrator-operator-b67b599dd-989w8\" (UID: \"98ac3e00-8bfb-4b64-9b87-e71416270280\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.810714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmj84\" (UniqueName: \"kubernetes.io/projected/d826b104-632f-4c53-9f01-04a7efdfc3c0-kube-api-access-pmj84\") pod \"ingress-operator-5b745b69d9-g6nl9\" (UID: \"d826b104-632f-4c53-9f01-04a7efdfc3c0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.824240 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.832038 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.842368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" event={"ID":"e05a47eb-0468-48cb-9e4e-19156acdda3a","Type":"ContainerStarted","Data":"167e8adc5871f830d8bc301e28069ee7630e247de465ce2d993e57f792633a9e"} Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.852227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" event={"ID":"ad210bef-5379-4c64-9f18-61d79338e155","Type":"ContainerStarted","Data":"f13c18696308c3d7da7cbb52c3a5a13404fa7443b8d4d173480df0f39444323d"} Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.859837 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.863562 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wl6jq"] Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.872038 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.885864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.885912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.885935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdd6q\" (UniqueName: \"kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.885964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0656a177-96a1-410c-ad6b-c059d91f58c0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-encryption-config\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswkr\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-serving-cert\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc 
kubenswrapper[4749]: I1129 01:13:20.886292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-cabundle\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886404 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2a420e5-d254-4e2f-b585-77884028379f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-client\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6x8w\" (UniqueName: \"kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886682 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gthkq\" (UniqueName: \"kubernetes.io/projected/f1727981-d091-4cc0-a8f1-a123ee72c1ca-kube-api-access-gthkq\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvj7\" (UniqueName: \"kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2a420e5-d254-4e2f-b585-77884028379f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.886858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttss\" (UniqueName: \"kubernetes.io/projected/39f0437d-74c4-4178-84ba-4c76299aa7e8-kube-api-access-2ttss\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3b4d38-c668-479d-9ebc-aea9febe04bc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-config\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: 
I1129 01:13:20.887208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3b4d38-c668-479d-9ebc-aea9febe04bc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887227 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rtq\" (UniqueName: \"kubernetes.io/projected/f2486a39-f7ad-443e-b243-4e7c6084e3ac-kube-api-access-d9rtq\") pod \"migrator-59844c95c7-vpqvw\" (UID: \"f2486a39-f7ad-443e-b243-4e7c6084e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-client\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc 
kubenswrapper[4749]: I1129 01:13:20.887401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5v8\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-kube-api-access-fr5v8\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-dir\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0656a177-96a1-410c-ad6b-c059d91f58c0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.887671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgsk\" (UniqueName: \"kubernetes.io/projected/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-kube-api-access-mvgsk\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.888085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.888106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.888217 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-policies\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.888244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: E1129 01:13:20.890234 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.39021987 +0000 UTC m=+144.562369727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.890621 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-key\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.890660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-service-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.890675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.890691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3b4d38-c668-479d-9ebc-aea9febe04bc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.890708 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0656a177-96a1-410c-ad6b-c059d91f58c0-config\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.892805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.894279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" event={"ID":"f4874dba-b38e-4f76-8b77-4bd2c58596d3","Type":"ContainerStarted","Data":"e2e52fae4090a1290bb3c08094dbb86b2f32c29b11f5226dacdfc7ba5d94187d"} Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.894597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: 
\"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-config\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkvqp\" (UniqueName: \"kubernetes.io/projected/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-kube-api-access-tkvqp\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khntg\" (UniqueName: \"kubernetes.io/projected/0b4e1a07-c117-40a3-8451-db2d6a18a98a-kube-api-access-khntg\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895715 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-proxy-tls\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.895876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-srv-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: 
\"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.896097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.896142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.896186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-serving-cert\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.896619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997326 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997547 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-config\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3b4d38-c668-479d-9ebc-aea9febe04bc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-certs\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates\") pod 
\"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rtq\" (UniqueName: \"kubernetes.io/projected/f2486a39-f7ad-443e-b243-4e7c6084e3ac-kube-api-access-d9rtq\") pod \"migrator-59844c95c7-vpqvw\" (UID: \"f2486a39-f7ad-443e-b243-4e7c6084e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-client\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/379d4463-1203-41ec-8a7d-f8474a23e7b8-cert\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 
01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jlhn\" (UniqueName: \"kubernetes.io/projected/8e8ea46b-ee66-4110-a123-dfe393cb4abf-kube-api-access-5jlhn\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5v8\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-kube-api-access-fr5v8\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz9z\" (UniqueName: \"kubernetes.io/projected/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-kube-api-access-9wz9z\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: \"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-dir\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0656a177-96a1-410c-ad6b-c059d91f58c0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgsk\" (UniqueName: \"kubernetes.io/projected/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-kube-api-access-mvgsk\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:20 crc 
kubenswrapper[4749]: I1129 01:13:20.998577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-policies\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2tl\" (UniqueName: \"kubernetes.io/projected/6ccb5157-d0d7-4fbc-821b-cd249461519b-kube-api-access-8v2tl\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.998661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-srv-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:20 crc kubenswrapper[4749]: E1129 01:13:20.998953 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.498920088 +0000 UTC m=+144.671069945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:20 crc kubenswrapper[4749]: I1129 01:13:20.997575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-2mp4r"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:20.999733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.000283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-policies\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.001188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.002425 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.002467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-audit-dir\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.003958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-key\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-452cg\" (UniqueName: \"kubernetes.io/projected/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-kube-api-access-452cg\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004763 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkszd\" (UniqueName: \"kubernetes.io/projected/608ac85c-70c4-4426-b036-e80c45b4bc64-kube-api-access-jkszd\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-service-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3b4d38-c668-479d-9ebc-aea9febe04bc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004828 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0656a177-96a1-410c-ad6b-c059d91f58c0-config\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fc319cf-59a5-496d-9970-3ce233ceba43-serving-cert\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: \"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc319cf-59a5-496d-9970-3ce233ceba43-config\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.004997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.005012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.005064 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.005083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-config\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.005440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3b4d38-c668-479d-9ebc-aea9febe04bc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.005954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.006556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-service-ca\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.006768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b4e1a07-c117-40a3-8451-db2d6a18a98a-config\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.008208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.010724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-config\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.010919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.010981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-etcd-client\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.011021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.011976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khntg\" (UniqueName: \"kubernetes.io/projected/0b4e1a07-c117-40a3-8451-db2d6a18a98a-kube-api-access-khntg\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ccb5157-d0d7-4fbc-821b-cd249461519b-metrics-tls\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkvqp\" (UniqueName: \"kubernetes.io/projected/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-kube-api-access-tkvqp\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-proxy-tls\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fplk5\" (UniqueName: \"kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012178 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-srv-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlqfh\" (UniqueName: \"kubernetes.io/projected/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-kube-api-access-vlqfh\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-profile-collector-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-serving-cert\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6ccb5157-d0d7-4fbc-821b-cd249461519b-config-volume\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdd6q\" (UniqueName: \"kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0656a177-96a1-410c-ad6b-c059d91f58c0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-socket-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-encryption-config\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-metrics-certs\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.012984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-mountpoint-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013101 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-csi-data-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013175 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswkr\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-plugins-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-node-bootstrap-token\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lqgq\" (UniqueName: \"kubernetes.io/projected/4fc319cf-59a5-496d-9970-3ce233ceba43-kube-api-access-5lqgq\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-serving-cert\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-cabundle\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.013757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-stats-auth\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.014852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-key\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.015923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.016384 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-default-certificate\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2a420e5-d254-4e2f-b585-77884028379f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzxv\" (UniqueName: \"kubernetes.io/projected/379d4463-1203-41ec-8a7d-f8474a23e7b8-kube-api-access-2dzxv\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-registration-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-client\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/78e8e60a-3e83-41f4-8e5d-e502d05118ac-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd6vt\" (UniqueName: \"kubernetes.io/projected/78e8e60a-3e83-41f4-8e5d-e502d05118ac-kube-api-access-bd6vt\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-service-ca-bundle\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttss\" (UniqueName: \"kubernetes.io/projected/39f0437d-74c4-4178-84ba-4c76299aa7e8-kube-api-access-2ttss\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6x8w\" (UniqueName: \"kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gthkq\" (UniqueName: \"kubernetes.io/projected/f1727981-d091-4cc0-a8f1-a123ee72c1ca-kube-api-access-gthkq\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvj7\" (UniqueName: 
\"kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.018937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2a420e5-d254-4e2f-b585-77884028379f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3b4d38-c668-479d-9ebc-aea9febe04bc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.019728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.020435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir\") pod 
\"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.020622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.020995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.021885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.022133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-encryption-config\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.024910 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/39f0437d-74c4-4178-84ba-4c76299aa7e8-signing-cabundle\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.027760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0656a177-96a1-410c-ad6b-c059d91f58c0-config\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.028602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.029550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-proxy-tls\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.032962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-etcd-client\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.033163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.035879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b4e1a07-c117-40a3-8451-db2d6a18a98a-serving-cert\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.039656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3b4d38-c668-479d-9ebc-aea9febe04bc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.040129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.040471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-serving-cert\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.041802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0656a177-96a1-410c-ad6b-c059d91f58c0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.042794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-srv-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.042956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f1727981-d091-4cc0-a8f1-a123ee72c1ca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 
01:13:21.056127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0656a177-96a1-410c-ad6b-c059d91f58c0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-q48hh\" (UID: \"0656a177-96a1-410c-ad6b-c059d91f58c0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.088118 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgsk\" (UniqueName: \"kubernetes.io/projected/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-kube-api-access-mvgsk\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.103427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rtq\" (UniqueName: \"kubernetes.io/projected/f2486a39-f7ad-443e-b243-4e7c6084e3ac-kube-api-access-d9rtq\") pod \"migrator-59844c95c7-vpqvw\" (UID: \"f2486a39-f7ad-443e-b243-4e7c6084e3ac\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.119098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b8b9a5db-5c5d-4cd8-be74-52305dfb6665-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wfblw\" (UID: \"b8b9a5db-5c5d-4cd8-be74-52305dfb6665\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120045 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-certs\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/379d4463-1203-41ec-8a7d-f8474a23e7b8-cert\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jlhn\" (UniqueName: \"kubernetes.io/projected/8e8ea46b-ee66-4110-a123-dfe393cb4abf-kube-api-access-5jlhn\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz9z\" (UniqueName: 
\"kubernetes.io/projected/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-kube-api-access-9wz9z\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: \"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2tl\" (UniqueName: \"kubernetes.io/projected/6ccb5157-d0d7-4fbc-821b-cd249461519b-kube-api-access-8v2tl\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120256 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-srv-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-452cg\" (UniqueName: \"kubernetes.io/projected/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-kube-api-access-452cg\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkszd\" (UniqueName: \"kubernetes.io/projected/608ac85c-70c4-4426-b036-e80c45b4bc64-kube-api-access-jkszd\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fc319cf-59a5-496d-9970-3ce233ceba43-serving-cert\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: 
\"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc319cf-59a5-496d-9970-3ce233ceba43-config\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ccb5157-d0d7-4fbc-821b-cd249461519b-metrics-tls\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fplk5\" (UniqueName: \"kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlqfh\" (UniqueName: \"kubernetes.io/projected/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-kube-api-access-vlqfh\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-profile-collector-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ccb5157-d0d7-4fbc-821b-cd249461519b-config-volume\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-socket-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-metrics-certs\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-mountpoint-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-csi-data-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-plugins-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-node-bootstrap-token\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lqgq\" (UniqueName: \"kubernetes.io/projected/4fc319cf-59a5-496d-9970-3ce233ceba43-kube-api-access-5lqgq\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-stats-auth\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-default-certificate\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-registration-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzxv\" (UniqueName: \"kubernetes.io/projected/379d4463-1203-41ec-8a7d-f8474a23e7b8-kube-api-access-2dzxv\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120774 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78e8e60a-3e83-41f4-8e5d-e502d05118ac-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd6vt\" (UniqueName: \"kubernetes.io/projected/78e8e60a-3e83-41f4-8e5d-e502d05118ac-kube-api-access-bd6vt\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.120820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-service-ca-bundle\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.121133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-csi-data-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.121909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.121932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-service-ca-bundle\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.122098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc319cf-59a5-496d-9970-3ce233ceba43-config\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.122716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ccb5157-d0d7-4fbc-821b-cd249461519b-config-volume\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.122978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-plugins-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: 
\"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.123485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-mountpoint-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.123541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-registration-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.125802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/379d4463-1203-41ec-8a7d-f8474a23e7b8-cert\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.126440 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.626415631 +0000 UTC m=+144.798565668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.127308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-node-bootstrap-token\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.127649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/608ac85c-70c4-4426-b036-e80c45b4bc64-certs\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.127779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-metrics-certs\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.127872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-socket-dir\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " 
pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.128185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-stats-auth\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.130913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: \"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.134902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.138012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-default-certificate\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.154075 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bjgxm"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.154115 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.154236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.182728 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" Nov 29 01:13:21 crc kubenswrapper[4749]: W1129 01:13:21.190416 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc13c12a_1e6f_48a7_b1ba_3bc1a83d027f.slice/crio-8b4beebe0b1b38920407ad0b15df8e29058d86de043b9a5bc2e9b1e590dd6e07 WatchSource:0}: Error finding container 8b4beebe0b1b38920407ad0b15df8e29058d86de043b9a5bc2e9b1e590dd6e07: Status 404 returned error can't find the container with id 8b4beebe0b1b38920407ad0b15df8e29058d86de043b9a5bc2e9b1e590dd6e07 Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.221901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.223362 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.723332695 +0000 UTC m=+144.895482552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.232071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswkr\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.239738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d3b4d38-c668-479d-9ebc-aea9febe04bc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2s7dc\" (UID: \"1d3b4d38-c668-479d-9ebc-aea9febe04bc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.251146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.251331 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.251570 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.253813 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.267758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khntg\" (UniqueName: \"kubernetes.io/projected/0b4e1a07-c117-40a3-8451-db2d6a18a98a-kube-api-access-khntg\") pod \"etcd-operator-b45778765-tmgqj\" (UID: \"0b4e1a07-c117-40a3-8451-db2d6a18a98a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.268003 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.268761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78e8e60a-3e83-41f4-8e5d-e502d05118ac-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.270350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ccb5157-d0d7-4fbc-821b-cd249461519b-metrics-tls\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.271899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2879ffd3-fbd8-4da2-9c3d-174c70f418b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nhn9s\" (UID: \"2879ffd3-fbd8-4da2-9c3d-174c70f418b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.272162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fc319cf-59a5-496d-9970-3ce233ceba43-serving-cert\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.272358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-profile-collector-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.273280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2a420e5-d254-4e2f-b585-77884028379f-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.276490 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.277650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.283403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.285109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.285422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.285758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.285965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.286175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8e8ea46b-ee66-4110-a123-dfe393cb4abf-srv-cert\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 
01:13:21.286690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.286740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2a420e5-d254-4e2f-b585-77884028379f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.287667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5v8\" (UniqueName: \"kubernetes.io/projected/a2a420e5-d254-4e2f-b585-77884028379f-kube-api-access-fr5v8\") pod \"cluster-image-registry-operator-dc59b4c8b-gkhzs\" (UID: \"a2a420e5-d254-4e2f-b585-77884028379f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.288148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdd6q\" (UniqueName: \"kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q\") pod \"console-f9d7485db-klx5g\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.289146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.290815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkvqp\" (UniqueName: \"kubernetes.io/projected/0d0ee540-2ed0-4074-91a7-7498e7d1da8d-kube-api-access-tkvqp\") pod \"apiserver-7bbb656c7d-dhpfn\" (UID: \"0d0ee540-2ed0-4074-91a7-7498e7d1da8d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.292020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttss\" (UniqueName: \"kubernetes.io/projected/39f0437d-74c4-4178-84ba-4c76299aa7e8-kube-api-access-2ttss\") pod \"service-ca-9c57cc56f-4drv6\" (UID: \"39f0437d-74c4-4178-84ba-4c76299aa7e8\") " pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.292070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fclkv"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.292750 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.311615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6x8w\" (UniqueName: 
\"kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w\") pod \"oauth-openshift-558db77b4-wm9jp\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.324532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.325086 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.825065133 +0000 UTC m=+144.997214990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.326960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.333382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gthkq\" (UniqueName: \"kubernetes.io/projected/f1727981-d091-4cc0-a8f1-a123ee72c1ca-kube-api-access-gthkq\") pod \"olm-operator-6b444d44fb-2hwrs\" (UID: \"f1727981-d091-4cc0-a8f1-a123ee72c1ca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.336393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.342614 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.349134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvj7\" (UniqueName: \"kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7\") pod \"collect-profiles-29406300-2xlmw\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.390128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-452cg\" (UniqueName: \"kubernetes.io/projected/f1015f6f-5ba5-472f-8a50-4b16bfdbf016-kube-api-access-452cg\") pod \"router-default-5444994796-5t7v2\" (UID: \"f1015f6f-5ba5-472f-8a50-4b16bfdbf016\") " pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.392652 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.401407 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.410102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkszd\" (UniqueName: \"kubernetes.io/projected/608ac85c-70c4-4426-b036-e80c45b4bc64-kube-api-access-jkszd\") pod \"machine-config-server-6l4z5\" (UID: \"608ac85c-70c4-4426-b036-e80c45b4bc64\") " pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.411502 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.425533 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.425924 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:21.925906105 +0000 UTC m=+145.098055962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.430607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.433385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fplk5\" (UniqueName: \"kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5\") pod \"marketplace-operator-79b997595-47pxv\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") " pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.437148 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.445444 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.450130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlqfh\" (UniqueName: \"kubernetes.io/projected/8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb-kube-api-access-vlqfh\") pod \"csi-hostpathplugin-tcjx5\" (UID: \"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb\") " pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.468336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lqgq\" (UniqueName: \"kubernetes.io/projected/4fc319cf-59a5-496d-9970-3ce233ceba43-kube-api-access-5lqgq\") pod \"service-ca-operator-777779d784-hxg42\" (UID: \"4fc319cf-59a5-496d-9970-3ce233ceba43\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.473266 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.512409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jlhn\" (UniqueName: \"kubernetes.io/projected/8e8ea46b-ee66-4110-a123-dfe393cb4abf-kube-api-access-5jlhn\") pod \"catalog-operator-68c6474976-5pbzn\" (UID: \"8e8ea46b-ee66-4110-a123-dfe393cb4abf\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.519180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.528089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.528500 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.028487959 +0000 UTC m=+145.200637816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.528607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.536865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8cq9l"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.540533 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.543808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzxv\" (UniqueName: \"kubernetes.io/projected/379d4463-1203-41ec-8a7d-f8474a23e7b8-kube-api-access-2dzxv\") pod \"ingress-canary-xxssl\" (UID: \"379d4463-1203-41ec-8a7d-f8474a23e7b8\") " pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.546004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.551669 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-rlstl"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.556582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2tl\" (UniqueName: \"kubernetes.io/projected/6ccb5157-d0d7-4fbc-821b-cd249461519b-kube-api-access-8v2tl\") pod \"dns-default-jgbtg\" (UID: \"6ccb5157-d0d7-4fbc-821b-cd249461519b\") " pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.559985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.565816 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.576465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xxssl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.581702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd6vt\" (UniqueName: \"kubernetes.io/projected/78e8e60a-3e83-41f4-8e5d-e502d05118ac-kube-api-access-bd6vt\") pod \"control-plane-machine-set-operator-78cbb6b69f-hj2jl\" (UID: \"78e8e60a-3e83-41f4-8e5d-e502d05118ac\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.582126 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-6l4z5" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.595869 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.596478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2d5l2"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.600413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.606847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz9z\" (UniqueName: \"kubernetes.io/projected/e3bc3d9f-ca91-4ab2-99f2-0558be9adf59-kube-api-access-9wz9z\") pod \"package-server-manager-789f6589d5-kpsn9\" (UID: \"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.628725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.629041 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.129026402 +0000 UTC m=+145.301176259 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.629260 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw"] Nov 29 01:13:21 crc kubenswrapper[4749]: W1129 01:13:21.685320 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd826b104_632f_4c53_9f01_04a7efdfc3c0.slice/crio-9ca6a240889909e071491cac3bb1576dffd1284b4d7854011a21979774c7ff71 WatchSource:0}: Error finding container 9ca6a240889909e071491cac3bb1576dffd1284b4d7854011a21979774c7ff71: Status 404 returned error can't find the container with id 9ca6a240889909e071491cac3bb1576dffd1284b4d7854011a21979774c7ff71 Nov 29 01:13:21 crc kubenswrapper[4749]: W1129 01:13:21.687670 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e6f770_684e_4d69_9d50_f85af092d6bc.slice/crio-655eb8f95ac013ebf8b0ed1a14677fd4d255c0bfb1f6edc0d0641a42f47fa24e WatchSource:0}: Error finding container 655eb8f95ac013ebf8b0ed1a14677fd4d255c0bfb1f6edc0d0641a42f47fa24e: Status 404 returned error can't find the container with id 655eb8f95ac013ebf8b0ed1a14677fd4d255c0bfb1f6edc0d0641a42f47fa24e Nov 29 01:13:21 crc kubenswrapper[4749]: W1129 01:13:21.689618 4749 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c95d96_1bbb_4e89_a92c_5df613dc068c.slice/crio-c7f217768698859ef57b150301225309ce9d6badb3145cc1561bc045227520f5 WatchSource:0}: Error finding container c7f217768698859ef57b150301225309ce9d6badb3145cc1561bc045227520f5: Status 404 returned error can't find the container with id c7f217768698859ef57b150301225309ce9d6badb3145cc1561bc045227520f5 Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.729804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.730084 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.23007361 +0000 UTC m=+145.402223457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: W1129 01:13:21.746871 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b9a5db_5c5d_4cd8_be74_52305dfb6665.slice/crio-cf520be829b643001837233674d086cb9d53608f1a872f248ab2e990ca7a9280 WatchSource:0}: Error finding container cf520be829b643001837233674d086cb9d53608f1a872f248ab2e990ca7a9280: Status 404 returned error can't find the container with id cf520be829b643001837233674d086cb9d53608f1a872f248ab2e990ca7a9280 Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.806127 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.811842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.848114 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.848517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.848815 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.348791229 +0000 UTC m=+145.520941086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.848987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.852600 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.352586493 +0000 UTC m=+145.524736350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.891521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw"] Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.896694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" event={"ID":"ccc661b4-9b29-4855-bad5-9973dc692c59","Type":"ContainerStarted","Data":"7d931d6c854b5ccdc6dc39e2bba11b595057a47dcd219fe4a3aa6f80e859418b"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.899992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" event={"ID":"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f","Type":"ContainerStarted","Data":"8b4beebe0b1b38920407ad0b15df8e29058d86de043b9a5bc2e9b1e590dd6e07"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.901312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" event={"ID":"98ac3e00-8bfb-4b64-9b87-e71416270280","Type":"ContainerStarted","Data":"7e6ab2e1558c6b1b3db52613a963d6cb41f1d1e3dc4cb21a49ee849e340282fc"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.905154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" event={"ID":"a59eb74a-0de0-46e4-8bfd-905c7b538393","Type":"ContainerStarted","Data":"e160013c1ad4bee9ea29958ee7d79babd52e918d2acb537e3d29e06bb74ede5b"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.905176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" event={"ID":"a59eb74a-0de0-46e4-8bfd-905c7b538393","Type":"ContainerStarted","Data":"e8261cbe95b735a31b738c0b8872cac87362eee211a50f7cd46d129346a0e87c"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.909607 4749 generic.go:334] "Generic (PLEG): container finished" podID="4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8" containerID="3dcc109bfa11eea6047638e9fec8451294afa161aa109926d12dd003aa5efb51" exitCode=0 Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.910392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" event={"ID":"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8","Type":"ContainerDied","Data":"3dcc109bfa11eea6047638e9fec8451294afa161aa109926d12dd003aa5efb51"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.910418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" event={"ID":"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8","Type":"ContainerStarted","Data":"fc328250a7692d9fac6e118cbf345658368501ae15adfeba35feb9f7465dfde3"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.911834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" 
event={"ID":"a19644e5-e46c-4286-8f46-5022e2bb45b4","Type":"ContainerStarted","Data":"8159b73bc3e590dfa4ae1d1f7bdc1fb27dd39d56d257459d0b90de21ef698b3f"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.911854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" event={"ID":"a19644e5-e46c-4286-8f46-5022e2bb45b4","Type":"ContainerStarted","Data":"d37f3ca41c492b091dad40fc7ed1f89e85fa43026923780a379baf74b7793b0a"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.912862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" event={"ID":"e05a47eb-0468-48cb-9e4e-19156acdda3a","Type":"ContainerStarted","Data":"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.913526 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.914534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" event={"ID":"b8b9a5db-5c5d-4cd8-be74-52305dfb6665","Type":"ContainerStarted","Data":"cf520be829b643001837233674d086cb9d53608f1a872f248ab2e990ca7a9280"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.914743 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-f75rz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.914769 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.915639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" event={"ID":"c1c95d96-1bbb-4e89-a92c-5df613dc068c","Type":"ContainerStarted","Data":"c7f217768698859ef57b150301225309ce9d6badb3145cc1561bc045227520f5"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.916403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" event={"ID":"0e05f53f-1275-42b8-8d25-1b6f96be0121","Type":"ContainerStarted","Data":"a61ae5cd38027aada6ef17a75c139f6f725bed3e2b865f8e6ca6f4c31ebe1d8a"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.916935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8cq9l" event={"ID":"ceffc235-a012-44b5-91ff-f45a19502453","Type":"ContainerStarted","Data":"f4969e52e6d587b4d1c501f9edf00aa4c61c226c4c1ee5856b4a1a3f8a85bdd8"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.928445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" event={"ID":"ad210bef-5379-4c64-9f18-61d79338e155","Type":"ContainerStarted","Data":"d45b9e35ca4c1af392011b520150c13ea2f1ac1f2bd99267da19b92338e6b130"} Nov 29 
01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.930527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" event={"ID":"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4","Type":"ContainerStarted","Data":"b7e4c1c662fc2b24761ddbd8cb71759126b8acfac06caaaed747ba72e65b953f"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.932118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" event={"ID":"f4874dba-b38e-4f76-8b77-4bd2c58596d3","Type":"ContainerStarted","Data":"e426b664aa87789c1371f9e69115f8a394638e90f9876137f8f0f0dac41d6070"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.946987 4749 generic.go:334] "Generic (PLEG): container finished" podID="534e3d1d-c71c-44e9-a589-b198ee8fa4e0" containerID="e7fac10ad00acfdbc7330f13b9303ec3d2475c7ece62c0d6b6eb368d758ab978" exitCode=0 Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.947131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" event={"ID":"534e3d1d-c71c-44e9-a589-b198ee8fa4e0","Type":"ContainerDied","Data":"e7fac10ad00acfdbc7330f13b9303ec3d2475c7ece62c0d6b6eb368d758ab978"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.947161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" event={"ID":"534e3d1d-c71c-44e9-a589-b198ee8fa4e0","Type":"ContainerStarted","Data":"c936b23faa55d6fb5989148b45d2234283d7012e991327b83f2f95cb7103b77c"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.950081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.950398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" event={"ID":"d826b104-632f-4c53-9f01-04a7efdfc3c0","Type":"ContainerStarted","Data":"9ca6a240889909e071491cac3bb1576dffd1284b4d7854011a21979774c7ff71"} Nov 29 01:13:21 crc kubenswrapper[4749]: E1129 01:13:21.950646 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.4506191 +0000 UTC m=+145.622769097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.954394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" event={"ID":"d15ea300-3b1a-46ab-b775-d771839bb2ee","Type":"ContainerStarted","Data":"16cdcabfb86d2b4e93527ecf01aa192588e774140efe8d0767b21f8be28c17a3"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.960450 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" event={"ID":"07e6f770-684e-4d69-9d50-f85af092d6bc","Type":"ContainerStarted","Data":"655eb8f95ac013ebf8b0ed1a14677fd4d255c0bfb1f6edc0d0641a42f47fa24e"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.968058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" event={"ID":"18a54704-5d0d-4caa-b347-2751a525a666","Type":"ContainerStarted","Data":"798ac6c419255f692948278efbd810990c6825583deb2c1e972ab8a33f316ac1"} Nov 29 01:13:21 crc kubenswrapper[4749]: I1129 01:13:21.970587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh"] Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.020557 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs"] Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.053736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.056393 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.556350159 +0000 UTC m=+145.728500016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: W1129 01:13:22.131881 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0656a177_96a1_410c_ad6b_c059d91f58c0.slice/crio-d65c285fae16aea5af91a59c94c0d0913974b4ab867dc9f81b8f7638b347cd15 WatchSource:0}: Error finding container d65c285fae16aea5af91a59c94c0d0913974b4ab867dc9f81b8f7638b347cd15: Status 404 returned error can't find the container with id d65c285fae16aea5af91a59c94c0d0913974b4ab867dc9f81b8f7638b347cd15 Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.157922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.158309 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.658288374 +0000 UTC m=+145.830438221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: W1129 01:13:22.160693 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2a420e5_d254_4e2f_b585_77884028379f.slice/crio-2dcdf9543062db861c08a05c6a5d94c93ec8087ff1a2b1152f8d02977c9c6ec0 WatchSource:0}: Error finding container 2dcdf9543062db861c08a05c6a5d94c93ec8087ff1a2b1152f8d02977c9c6ec0: Status 404 returned error can't find the container with id 2dcdf9543062db861c08a05c6a5d94c93ec8087ff1a2b1152f8d02977c9c6ec0 Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.226003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn"] Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.265138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.265512 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.765489327 +0000 UTC m=+145.937639184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.283277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-klx5g"] Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.366687 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.367158 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.867133343 +0000 UTC m=+146.039283200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.471474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.472365 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:22.972347256 +0000 UTC m=+146.144497113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.572667 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.572829 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.072805386 +0000 UTC m=+146.244955243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.579684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.580082 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.080066384 +0000 UTC m=+146.252216241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: W1129 01:13:22.650120 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe19f2e_05c9_4ebb_a96a_9bdb02cebba8.slice/crio-f5b06bd016653980d82de75e75738fbac6a9b4c9be63792339ad8bcbb98cd339 WatchSource:0}: Error finding container f5b06bd016653980d82de75e75738fbac6a9b4c9be63792339ad8bcbb98cd339: Status 404 returned error can't find the container with id f5b06bd016653980d82de75e75738fbac6a9b4c9be63792339ad8bcbb98cd339 Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.683069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.683265 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.183234546 +0000 UTC m=+146.355384403 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.683502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.683872 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.183863645 +0000 UTC m=+146.356013502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.800491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.809258 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.309234434 +0000 UTC m=+146.481384291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.813617 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw"] Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.819559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.820316 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.320297666 +0000 UTC m=+146.492447523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:22 crc kubenswrapper[4749]: I1129 01:13:22.921490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:22 crc kubenswrapper[4749]: E1129 01:13:22.922297 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.422268392 +0000 UTC m=+146.594418249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.006594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" event={"ID":"0656a177-96a1-410c-ad6b-c059d91f58c0","Type":"ContainerStarted","Data":"d65c285fae16aea5af91a59c94c0d0913974b4ab867dc9f81b8f7638b347cd15"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.016837 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xlc5j" podStartSLOduration=123.016822355 podStartE2EDuration="2m3.016822355s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.01434818 +0000 UTC m=+146.186498037" watchObservedRunningTime="2025-11-29 01:13:23.016822355 +0000 UTC m=+146.188972212" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.020886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" event={"ID":"ccc661b4-9b29-4855-bad5-9973dc692c59","Type":"ContainerStarted","Data":"d7bb84cbadddd47fdba21156498d8d4a7e02dbc423e878e05e6758ca89ac9f32"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.023013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.023036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.023062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.023096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 
01:13:23.023119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.023454 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.523441714 +0000 UTC m=+146.695591561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.025509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.042915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" event={"ID":"d4fc6a25-6ce7-42e5-8bf6-fb993ca7f5a4","Type":"ContainerStarted","Data":"61db7cb65e5e0b9fa4e3e4def779e741394779737637fec125ca20dd57175ac3"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.093653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.097724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.099867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.127353 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-ttpw6" 
podStartSLOduration=123.127328717 podStartE2EDuration="2m3.127328717s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.120344517 +0000 UTC m=+146.292494384" watchObservedRunningTime="2025-11-29 01:13:23.127328717 +0000 UTC m=+146.299478574" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.127612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.128975 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.628958476 +0000 UTC m=+146.801108333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.177679 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" podStartSLOduration=123.17765768 podStartE2EDuration="2m3.17765768s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.17165781 +0000 UTC m=+146.343807667" watchObservedRunningTime="2025-11-29 01:13:23.17765768 +0000 UTC m=+146.349807537" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.253695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.254121 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.754107849 +0000 UTC m=+146.926257706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.288733 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fclkv" podStartSLOduration=123.288713599 podStartE2EDuration="2m3.288713599s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.25449272 +0000 UTC m=+146.426642567" watchObservedRunningTime="2025-11-29 01:13:23.288713599 +0000 UTC m=+146.460863456" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.310753 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.336667 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.338093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.362223 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.363401 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.863383054 +0000 UTC m=+147.035532911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" event={"ID":"0d0ee540-2ed0-4074-91a7-7498e7d1da8d","Type":"ContainerStarted","Data":"cfd58f792bd9cbef06a25dabb837da5310f4ba605cf82ac347da74a3fc7a6d1c"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" event={"ID":"a2a420e5-d254-4e2f-b585-77884028379f","Type":"ContainerStarted","Data":"2dcdf9543062db861c08a05c6a5d94c93ec8087ff1a2b1152f8d02977c9c6ec0"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373282 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-klx5g" event={"ID":"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8","Type":"ContainerStarted","Data":"f5b06bd016653980d82de75e75738fbac6a9b4c9be63792339ad8bcbb98cd339"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373306 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373321 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc"] Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373337 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" event={"ID":"0e05f53f-1275-42b8-8d25-1b6f96be0121","Type":"ContainerStarted","Data":"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" event={"ID":"d826b104-632f-4c53-9f01-04a7efdfc3c0","Type":"ContainerStarted","Data":"f6e6aa494341cdab5aa93cc89764fe5eb33240ee9a74c8cb9e0c6fdbb2ce722b"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.373377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" event={"ID":"cc13c12a-1e6f-48a7-b1ba-3bc1a83d027f","Type":"ContainerStarted","Data":"2801cb45ce0247afc7ce2c480a45822a7179e4415b223c2dcc99a6fe04f70d3a"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.398508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" 
event={"ID":"f2486a39-f7ad-443e-b243-4e7c6084e3ac","Type":"ContainerStarted","Data":"9f0e23af0546b73c17e2142b27d80698c67deb104ee62b638f9ac12b56643341"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.409320 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" podStartSLOduration=123.409300083 podStartE2EDuration="2m3.409300083s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.407017525 +0000 UTC m=+146.579167382" watchObservedRunningTime="2025-11-29 01:13:23.409300083 +0000 UTC m=+146.581449930" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.410155 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" podStartSLOduration=123.410148329 podStartE2EDuration="2m3.410148329s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.297621327 +0000 UTC m=+146.469771194" watchObservedRunningTime="2025-11-29 01:13:23.410148329 +0000 UTC m=+146.582298186" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.419170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" event={"ID":"f4874dba-b38e-4f76-8b77-4bd2c58596d3","Type":"ContainerStarted","Data":"c4d33560b513b9422a2777f607ff261384d1a06021be48f41d1a3d31dfee99f2"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.431893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" event={"ID":"98ac3e00-8bfb-4b64-9b87-e71416270280","Type":"ContainerStarted","Data":"88ddecda7cdb7bfbec6bc59c88a6cb3d5ee502f4c993fe3c7e9082f7e15158c1"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.450007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" event={"ID":"c1c95d96-1bbb-4e89-a92c-5df613dc068c","Type":"ContainerStarted","Data":"73e2d390d4abb555f10d809304dd09c0fbfa008f3d3b9b5dec1b519fb81e5924"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.452358 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-kjzmw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.452397 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" podUID="c1c95d96-1bbb-4e89-a92c-5df613dc068c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.452780 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.465015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.467502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5t7v2" event={"ID":"f1015f6f-5ba5-472f-8a50-4b16bfdbf016","Type":"ContainerStarted","Data":"e2f609cf06be0a27583616f401e6da7f3b599bb3d681c02b4398e7475adf4c9c"} Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.469055 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:23.969032279 +0000 UTC m=+147.141182136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.469977 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hrpj2" podStartSLOduration=123.469955047 podStartE2EDuration="2m3.469955047s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.450642226 +0000 UTC m=+146.622792083" watchObservedRunningTime="2025-11-29 01:13:23.469955047 +0000 UTC m=+146.642104904" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.498170 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" podStartSLOduration=123.498152745 podStartE2EDuration="2m3.498152745s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.496819225 +0000 UTC m=+146.668969082" watchObservedRunningTime="2025-11-29 01:13:23.498152745 +0000 UTC m=+146.670302612" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.504794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6l4z5" event={"ID":"608ac85c-70c4-4426-b036-e80c45b4bc64","Type":"ContainerStarted","Data":"6a64b41d473c7b0d907bdfc39256fcfe5d6c49f02ec1af3371159cae9f910b75"} Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.519499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.553248 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-989w8" podStartSLOduration=123.55322863 podStartE2EDuration="2m3.55322863s" podCreationTimestamp="2025-11-29 01:11:20 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:23.520212888 +0000 UTC m=+146.692362755" watchObservedRunningTime="2025-11-29 01:13:23.55322863 +0000 UTC m=+146.725378477" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.566643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.567719 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.067681325 +0000 UTC m=+147.239831182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.624506 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.670285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.673465 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.173445025 +0000 UTC m=+147.345594882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.771738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.771955 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.271929076 +0000 UTC m=+147.444078933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.772435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.772717 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.272706019 +0000 UTC m=+147.444855876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.790517 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-4drv6"] Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.835142 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s"] Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.875902 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.876334 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.376317634 +0000 UTC m=+147.548467491 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.906272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-tmgqj"] Nov 29 01:13:23 crc kubenswrapper[4749]: W1129 01:13:23.957390 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f0437d_74c4_4178_84ba_4c76299aa7e8.slice/crio-b45cb0c5a1be946932b62889cb89d9afdc2580cb379de2e8a0ccaf53c01e3930 WatchSource:0}: Error finding container b45cb0c5a1be946932b62889cb89d9afdc2580cb379de2e8a0ccaf53c01e3930: Status 404 returned error can't find the container with id b45cb0c5a1be946932b62889cb89d9afdc2580cb379de2e8a0ccaf53c01e3930 Nov 29 01:13:23 crc kubenswrapper[4749]: I1129 01:13:23.983379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:23 crc kubenswrapper[4749]: E1129 01:13:23.983851 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-29 01:13:24.483832976 +0000 UTC m=+147.655982833 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.025448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.028720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs"] Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.089320 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.589299337 +0000 UTC m=+147.761449194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.089356 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.089548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.089939 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.589931696 +0000 UTC m=+147.762081553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.120534 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bjgxm" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.155417 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hxg42"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.192559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.193026 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.693003655 +0000 UTC m=+147.865153512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.294141 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.294996 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.794979651 +0000 UTC m=+147.967129498 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.392322 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tcjx5"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.396622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.396889 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:24.896874684 +0000 UTC m=+148.069024541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.421397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl"] Nov 29 01:13:24 crc kubenswrapper[4749]: W1129 01:13:24.473840 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a077b1a_6a10_45ad_bff5_d54e5d0fe1cb.slice/crio-60a74daf68bd36e2a2cc137b8c6345823767acce3afe77d15b73bea5d748ddf1 WatchSource:0}: Error finding container 60a74daf68bd36e2a2cc137b8c6345823767acce3afe77d15b73bea5d748ddf1: Status 404 returned error can't find the container with id 60a74daf68bd36e2a2cc137b8c6345823767acce3afe77d15b73bea5d748ddf1 Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.478882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.505052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.505498 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-29 01:13:25.005485019 +0000 UTC m=+148.177634876 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.524720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xxssl"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.580832 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.581878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" event={"ID":"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb","Type":"ContainerStarted","Data":"60a74daf68bd36e2a2cc137b8c6345823767acce3afe77d15b73bea5d748ddf1"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.599098 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jgbtg"] Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.605933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.606423 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.106400583 +0000 UTC m=+148.278550430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.613841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerStarted","Data":"7277fa7bfb7a891f31b6f55e24826af3efd14671526c72a089706a3acbe937c3"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.625328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" event={"ID":"f1727981-d091-4cc0-a8f1-a123ee72c1ca","Type":"ContainerStarted","Data":"78cb2c897a8e916b1a7c1c9788fa02b990fd554f68cc40c8283f7648c71502ef"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.654420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" event={"ID":"a19644e5-e46c-4286-8f46-5022e2bb45b4","Type":"ContainerStarted","Data":"6f02334ab2dedf4ccd1a3b9c9e08c178da62fff9e8336f59b932280ebba04539"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.672456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8cq9l" event={"ID":"ceffc235-a012-44b5-91ff-f45a19502453","Type":"ContainerStarted","Data":"bc712f09ccb381feb0eabdfe6b0eb55b090e09f5c4208fa6beb99650ed8cecb6"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.673585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.687162 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cq9l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.687273 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cq9l" podUID="ceffc235-a012-44b5-91ff-f45a19502453" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.710557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.712394 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.212376309 +0000 UTC m=+148.384526166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.725496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" event={"ID":"2cf44185-40f9-4d15-93a0-13604622e06f","Type":"ContainerStarted","Data":"3ccad68fc65e3aa03bc6943c421b50a0adb8c22719504ac98a0dbad1d794b5cf"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.736944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5t7v2" event={"ID":"f1015f6f-5ba5-472f-8a50-4b16bfdbf016","Type":"ContainerStarted","Data":"744bb620d74ee5f0c31d66ede9e1c4d8b08b074c05cf7decbb55ebf3a1f02b3f"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.740227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" event={"ID":"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917","Type":"ContainerStarted","Data":"1c173ef22cd2a1d3a31bd3c2293672c8b2fd997d8673575e9245d2e46de4434d"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.740977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" event={"ID":"4fc319cf-59a5-496d-9970-3ce233ceba43","Type":"ContainerStarted","Data":"053049d7ee3f4236c5a6652ffd72838e8ba71fef7c046b3990a581e7626fe276"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.741590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" event={"ID":"2879ffd3-fbd8-4da2-9c3d-174c70f418b4","Type":"ContainerStarted","Data":"d0db136ffe1c61f23109023f89f3a6058f9f53afc65874dfbded2e7ac0f51588"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.748473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" event={"ID":"0b4e1a07-c117-40a3-8451-db2d6a18a98a","Type":"ContainerStarted","Data":"d234cc1affeecdfd0a9a2f16fb0650a39f2e8be3991e3a25b09db69b1137124f"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.749501 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-2mp4r" podStartSLOduration=124.749488485 podStartE2EDuration="2m4.749488485s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.71406955 +0000 UTC m=+147.886219407" watchObservedRunningTime="2025-11-29 01:13:24.749488485 +0000 UTC m=+147.921638342" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.801806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" event={"ID":"18a54704-5d0d-4caa-b347-2751a525a666","Type":"ContainerStarted","Data":"29379f81ce13c0c793f2d98b8585314936b8621dda2b85721f1ef40500648f9e"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.812332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" event={"ID":"39f0437d-74c4-4178-84ba-4c76299aa7e8","Type":"ContainerStarted","Data":"b45cb0c5a1be946932b62889cb89d9afdc2580cb379de2e8a0ccaf53c01e3930"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.812873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.814114 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.314099337 +0000 UTC m=+148.486249194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.822991 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8cq9l" podStartSLOduration=124.822977644 podStartE2EDuration="2m4.822977644s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.771578809 +0000 UTC m=+147.943728676" watchObservedRunningTime="2025-11-29 01:13:24.822977644 +0000 UTC m=+147.995127501" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.824684 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" podStartSLOduration=125.824679376 podStartE2EDuration="2m5.824679376s" podCreationTimestamp="2025-11-29 01:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.822518011 +0000 UTC m=+147.994667868" watchObservedRunningTime="2025-11-29 01:13:24.824679376 +0000 UTC m=+147.996829233" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.834280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" event={"ID":"1d3b4d38-c668-479d-9ebc-aea9febe04bc","Type":"ContainerStarted","Data":"270e8f060a0ef9cf70632db5225163a11065679f97694280bf8b0e12e563b24d"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.863531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" event={"ID":"4c2db067-e1b5-48ba-bc7e-3df5e6c8feb8","Type":"ContainerStarted","Data":"00a38378e04f32e02262c6d2f2688d963910711fd7d61604366af9c867b2600e"} Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.863572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 
01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.867507 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5t7v2" podStartSLOduration=124.867492773 podStartE2EDuration="2m4.867492773s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.864818802 +0000 UTC m=+148.036968659" watchObservedRunningTime="2025-11-29 01:13:24.867492773 +0000 UTC m=+148.039642630" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.912682 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" podStartSLOduration=124.912667271 podStartE2EDuration="2m4.912667271s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.910426114 +0000 UTC m=+148.082575981" watchObservedRunningTime="2025-11-29 01:13:24.912667271 +0000 UTC m=+148.084817128" Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.914366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:24 crc kubenswrapper[4749]: E1129 01:13:24.916661 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.41664628 +0000 UTC m=+148.588796137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:24 crc kubenswrapper[4749]: I1129 01:13:24.973939 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" podStartSLOduration=124.973920352 podStartE2EDuration="2m4.973920352s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:24.973388926 +0000 UTC m=+148.145538793" watchObservedRunningTime="2025-11-29 01:13:24.973920352 +0000 UTC m=+148.146070229" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.020343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.020769 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.52075466 +0000 UTC m=+148.692904517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.122323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.123235 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.623220071 +0000 UTC m=+148.795369928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.225998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.226799 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.726779605 +0000 UTC m=+148.898929462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.330629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.331312 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.831296867 +0000 UTC m=+149.003446724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.374216 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.374292 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.434499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.434835 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:25.934820289 +0000 UTC m=+149.106970136 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.455686 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-kjzmw" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.529612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.536572 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:25 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:25 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:25 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.536644 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.537361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.537824 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.037808355 +0000 UTC m=+149.209958212 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.639664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.639874 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.139827822 +0000 UTC m=+149.311977679 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.640029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.642648 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.142623046 +0000 UTC m=+149.314772903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.743818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.744808 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.244789078 +0000 UTC m=+149.416938935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.807634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.846146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.846357 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.346337091 +0000 UTC m=+149.518486988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.896634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" event={"ID":"ccc661b4-9b29-4855-bad5-9973dc692c59","Type":"ContainerStarted","Data":"cb6cad58c6fee546488cddf898b5d7f5d914a3c8ce2e9b5750f8805abd9abeff"} Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.906479 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" event={"ID":"534e3d1d-c71c-44e9-a589-b198ee8fa4e0","Type":"ContainerStarted","Data":"a6ed0dd599712df3d1631356e5a768c5318eed3ba5748a470afc3d46e8565edb"} Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.909150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" event={"ID":"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917","Type":"ContainerStarted","Data":"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3"} Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.909746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.930889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vz6bd" podStartSLOduration=125.930858362 podStartE2EDuration="2m5.930858362s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:25.928499211 +0000 UTC m=+149.100649068" watchObservedRunningTime="2025-11-29 01:13:25.930858362 +0000 UTC m=+149.103008219" Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.956124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:25 crc kubenswrapper[4749]: E1129 01:13:25.957702 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.457679288 +0000 UTC m=+149.629829145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.995119 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d0ee540-2ed0-4074-91a7-7498e7d1da8d" containerID="58c916137e24b7b803218464cea518fe7250ba2546fcea5837d8a0d52ff769d6" exitCode=0 Nov 29 01:13:25 crc kubenswrapper[4749]: I1129 01:13:25.995327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" event={"ID":"0d0ee540-2ed0-4074-91a7-7498e7d1da8d","Type":"ContainerDied","Data":"58c916137e24b7b803218464cea518fe7250ba2546fcea5837d8a0d52ff769d6"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:25.999788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" event={"ID":"07e6f770-684e-4d69-9d50-f85af092d6bc","Type":"ContainerStarted","Data":"cff0eebba3e64ad9c376948105c4285e60d7eea908e2e209030f5e740394c304"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.011046 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" event={"ID":"18a54704-5d0d-4caa-b347-2751a525a666","Type":"ContainerStarted","Data":"25fc476b3211d4a2f57c41d819ca2f37a6be3540b587eecb147d835ecca4d31d"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.030690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-klx5g" event={"ID":"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8","Type":"ContainerStarted","Data":"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.031375 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" podStartSLOduration=126.031350673 podStartE2EDuration="2m6.031350673s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:25.964495253 +0000 UTC m=+149.136645110" watchObservedRunningTime="2025-11-29 01:13:26.031350673 +0000 UTC m=+149.203500530" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.033954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" event={"ID":"b8b9a5db-5c5d-4cd8-be74-52305dfb6665","Type":"ContainerStarted","Data":"ae27763f9a291d3093b33a43a71d3957069e4f4426f81341179a7319abdfe82e"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.060726 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.062905 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.562885121 +0000 UTC m=+149.735035168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.085750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" event={"ID":"f1727981-d091-4cc0-a8f1-a123ee72c1ca","Type":"ContainerStarted","Data":"94eb1ce5914ac84acf114ff588e0faabdbeca9e8abce1fec29275c4592bf7116"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.086852 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.098210 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-klx5g" podStartSLOduration=126.098165612 podStartE2EDuration="2m6.098165612s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.094916004 +0000 UTC m=+149.267065861" watchObservedRunningTime="2025-11-29 01:13:26.098165612 +0000 UTC m=+149.270315469" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.125100 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" podStartSLOduration=126.125078331 podStartE2EDuration="2m6.125078331s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.12472329 +0000 UTC m=+149.296873157" watchObservedRunningTime="2025-11-29 01:13:26.125078331 +0000 UTC m=+149.297228188" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.140492 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2hwrs" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.161962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" event={"ID":"a2a420e5-d254-4e2f-b585-77884028379f","Type":"ContainerStarted","Data":"972a113711a2a1897f08eb769ee016767fd7090ae08024599176f842ae87630f"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.162315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.163489 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.663470845 +0000 UTC m=+149.835620712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.164014 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wfblw" podStartSLOduration=126.164002741 podStartE2EDuration="2m6.164002741s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.162125485 +0000 UTC m=+149.334275362" watchObservedRunningTime="2025-11-29 01:13:26.164002741 +0000 UTC m=+149.336152598" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.186870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" event={"ID":"d15ea300-3b1a-46ab-b775-d771839bb2ee","Type":"ContainerStarted","Data":"afc61abdb874e37bb15aef777a1c7586947350d2cd4b0c51cf9fae64f035774c"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.205361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"42422eb8ed380de37f826aa198429010816fb8305b4c7fdd60f525d46bc64dc4"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.206115 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gkhzs" podStartSLOduration=126.206087036 podStartE2EDuration="2m6.206087036s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.1899025 +0000 UTC m=+149.362052357" watchObservedRunningTime="2025-11-29 01:13:26.206087036 +0000 UTC m=+149.378236893" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.227574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" event={"ID":"2879ffd3-fbd8-4da2-9c3d-174c70f418b4","Type":"ContainerStarted","Data":"a4518125ef3be9294d33d04851911fbf403eaed6e2e86b845594513573732250"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.238473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0a97ca6febcc42006be371e4bbf9a230fe56db90a2a895fe382f55729d665281"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.240910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" 
event={"ID":"78e8e60a-3e83-41f4-8e5d-e502d05118ac","Type":"ContainerStarted","Data":"17c5f7a9235e0f2276672e673e56683434ea23aa810a59829174c846c182e9d7"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.240932 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" event={"ID":"78e8e60a-3e83-41f4-8e5d-e502d05118ac","Type":"ContainerStarted","Data":"0fa4ebfd5e08653ceead86039aed010b53860df4a29c94cd72eb90566645e189"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.267627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.269688 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.769664708 +0000 UTC m=+149.941814565 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.298218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" event={"ID":"8e8ea46b-ee66-4110-a123-dfe393cb4abf","Type":"ContainerStarted","Data":"e5435ae5be1f769404b94db1b1ce0f51b5beacb24fc8f8eea0554f0fefccc46a"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.299032 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.303654 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-hj2jl" podStartSLOduration=126.303630089 podStartE2EDuration="2m6.303630089s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.286637228 +0000 UTC m=+149.458787085" watchObservedRunningTime="2025-11-29 01:13:26.303630089 +0000 UTC m=+149.475779946" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.324731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" event={"ID":"2cf44185-40f9-4d15-93a0-13604622e06f","Type":"ContainerStarted","Data":"f22f79437c777e2c389216011056a2d471465b655bdb3c16cfd4cd502d6055c6"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.325020 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5pbzn container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.325058 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" podUID="8e8ea46b-ee66-4110-a123-dfe393cb4abf" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.333687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" event={"ID":"1d3b4d38-c668-479d-9ebc-aea9febe04bc","Type":"ContainerStarted","Data":"7552a7afc3e8e46d84edec2043ad9992c9eb2cc7f51efb4a24763b83de2c812a"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.351417 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" podStartSLOduration=126.351397625 podStartE2EDuration="2m6.351397625s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.348260041 +0000 UTC m=+149.520409908" watchObservedRunningTime="2025-11-29 01:13:26.351397625 +0000 UTC m=+149.523547482" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.371402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.373816 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.873795839 +0000 UTC m=+150.045945696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.378184 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2s7dc" podStartSLOduration=126.3781606 podStartE2EDuration="2m6.3781606s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.373682075 +0000 UTC m=+149.545831932" watchObservedRunningTime="2025-11-29 01:13:26.3781606 +0000 UTC m=+149.550310457" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.384766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" event={"ID":"d826b104-632f-4c53-9f01-04a7efdfc3c0","Type":"ContainerStarted","Data":"bcdd53e21a91b7ea2158c7acc2b207c7bf9824fdb51c707b5cbff8a348bea518"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.439969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-6l4z5" event={"ID":"608ac85c-70c4-4426-b036-e80c45b4bc64","Type":"ContainerStarted","Data":"518a02c664ad7e12b3d1e2fede4a2b772db8132368773a5df2a53f6b1975826e"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.462504 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6nl9" podStartSLOduration=126.462486315 podStartE2EDuration="2m6.462486315s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.41741002 +0000 UTC m=+149.589559877" watchObservedRunningTime="2025-11-29 01:13:26.462486315 +0000 UTC m=+149.634636172" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.463122 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-6l4z5" podStartSLOduration=8.463118004 podStartE2EDuration="8.463118004s" podCreationTimestamp="2025-11-29 01:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.462446354 +0000 UTC m=+149.634596211" watchObservedRunningTime="2025-11-29 01:13:26.463118004 +0000 UTC m=+149.635267861" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.473013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.475129 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:26.975117995 +0000 UTC m=+150.147267852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.502388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jgbtg" event={"ID":"6ccb5157-d0d7-4fbc-821b-cd249461519b","Type":"ContainerStarted","Data":"dd4ca7dafa4e3f5a6a0bb6a9f3ee0cdb1e23738945c09db35da1229b18ed3b77"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.542341 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:26 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:26 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:26 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.542395 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.547517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" event={"ID":"0656a177-96a1-410c-ad6b-c059d91f58c0","Type":"ContainerStarted","Data":"17deff8de278ab7121b5c8562b5175fcf9b4541b26ffc69af9aebeefc57b2dcf"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.575702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.576112 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.076096621 +0000 UTC m=+150.248246478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.580131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" event={"ID":"0b4e1a07-c117-40a3-8451-db2d6a18a98a","Type":"ContainerStarted","Data":"95a8715f78dd34f43d7bcefc1214bb96860ca727a6a502452c6a613925f4c559"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.639522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" event={"ID":"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59","Type":"ContainerStarted","Data":"a0d6db6b15ff2ae85445ac7edd92883e9e02af9779c2ad8bd9b0c74647d55734"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.653066 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-q48hh" podStartSLOduration=126.653048424 podStartE2EDuration="2m6.653048424s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.598246466 +0000 UTC m=+149.770396333" watchObservedRunningTime="2025-11-29 01:13:26.653048424 +0000 UTC m=+149.825198281" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.678149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.678449 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.178436107 +0000 UTC m=+150.350585964 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.712655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" event={"ID":"39f0437d-74c4-4178-84ba-4c76299aa7e8","Type":"ContainerStarted","Data":"354dd4ffcd5a2f216c86ca6a714dfe5441a0a63a9cf70a670d2f50b5900f1ec5"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.760872 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-tmgqj" podStartSLOduration=126.760853275 podStartE2EDuration="2m6.760853275s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.672668084 +0000 UTC m=+149.844817941" watchObservedRunningTime="2025-11-29 01:13:26.760853275 +0000 UTC m=+149.933003132" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.777634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerStarted","Data":"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.778646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.780165 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.280149895 +0000 UTC m=+150.452299752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.780631 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.815332 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47pxv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.815389 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.817787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9d27063cb07445a18005d735c54efe7b6cd566ad5952e18a4e12f20bbb942f38"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.818399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.826742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-4drv6" podStartSLOduration=126.826725345 podStartE2EDuration="2m6.826725345s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.761566386 +0000 UTC m=+149.933716243" watchObservedRunningTime="2025-11-29 01:13:26.826725345 +0000 UTC m=+149.998875202" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.827209 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" podStartSLOduration=126.827189219 podStartE2EDuration="2m6.827189219s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.825092036 +0000 UTC m=+149.997241903" watchObservedRunningTime="2025-11-29 01:13:26.827189219 +0000 UTC m=+149.999339076" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.837850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xxssl" event={"ID":"379d4463-1203-41ec-8a7d-f8474a23e7b8","Type":"ContainerStarted","Data":"21cb3a3a6573cdfaadda848a1cb9b7306c12f047cdaa73d9953ee5350334eb8a"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.857522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" event={"ID":"f2486a39-f7ad-443e-b243-4e7c6084e3ac","Type":"ContainerStarted","Data":"ab2e7b134b9f7555a93370b7afe318a67a935b4c43fa82975328d8b7c9027747"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.857578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" event={"ID":"f2486a39-f7ad-443e-b243-4e7c6084e3ac","Type":"ContainerStarted","Data":"c3fdd83f5ed34a9d8297bce76d291592149cd1bcdc3c49d98220f3bc15feecd3"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.883432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.883737 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.383725759 +0000 UTC m=+150.555875616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.891016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" event={"ID":"4fc319cf-59a5-496d-9970-3ce233ceba43","Type":"ContainerStarted","Data":"a353d175177e73feb4632038e7183317d9ea67dfe4354164b69b46bd3203984c"} Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.894205 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cq9l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.894246 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cq9l" podUID="ceffc235-a012-44b5-91ff-f45a19502453" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.904345 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xxssl" podStartSLOduration=8.904326908 podStartE2EDuration="8.904326908s" podCreationTimestamp="2025-11-29 01:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.902597316 +0000 UTC m=+150.074747183" watchObservedRunningTime="2025-11-29 01:13:26.904326908 +0000 UTC m=+150.076476765" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.915278 4749 patch_prober.go:28] interesting 
pod/oauth-openshift-558db77b4-wm9jp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.915332 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.929545 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hxg42" podStartSLOduration=126.929527836 podStartE2EDuration="2m6.929527836s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.926529626 +0000 UTC m=+150.098679483" watchObservedRunningTime="2025-11-29 01:13:26.929527836 +0000 UTC m=+150.101677683" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.958791 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vpqvw" podStartSLOduration=126.958771885 podStartE2EDuration="2m6.958771885s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:26.956803896 +0000 UTC m=+150.128953753" watchObservedRunningTime="2025-11-29 01:13:26.958771885 +0000 UTC m=+150.130921742" Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.984741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.984865 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.484841558 +0000 UTC m=+150.656991415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:26 crc kubenswrapper[4749]: I1129 01:13:26.986472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:26 crc kubenswrapper[4749]: E1129 01:13:26.987329 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.487300762 +0000 UTC m=+150.659450689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.087758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.088187 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.588159834 +0000 UTC m=+150.760309691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.191002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.191541 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.691517832 +0000 UTC m=+150.863667689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.291769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.291939 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.79191552 +0000 UTC m=+150.964065377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.292432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.293123 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.793096895 +0000 UTC m=+150.965246752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.393324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.393597 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.893580956 +0000 UTC m=+151.065730813 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.495764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.496187 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:27.996167511 +0000 UTC m=+151.168317368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.536212 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:27 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:27 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:27 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.536297 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.598913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.599180 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.099139166 +0000 UTC m=+151.271289023 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.599362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.599802 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.099794846 +0000 UTC m=+151.271944703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.700657 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.700850 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.200823383 +0000 UTC m=+151.372973240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.700927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.701264 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.201245636 +0000 UTC m=+151.373395493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.801585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.801850 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.301795369 +0000 UTC m=+151.473945226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.801952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.802325 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.302309624 +0000 UTC m=+151.474459481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.835781 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.898551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" event={"ID":"0d0ee540-2ed0-4074-91a7-7498e7d1da8d","Type":"ContainerStarted","Data":"4b90158692f57014ff4fa2253783a1d92add1ad65da9fe6db978e9e0aeb208d2"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.900301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" event={"ID":"2879ffd3-fbd8-4da2-9c3d-174c70f418b4","Type":"ContainerStarted","Data":"2f2cc7cec7d84c94b49941f2ba5e200750fae2d25f22d2f28ef3a69b85f994a9"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.902031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" event={"ID":"d15ea300-3b1a-46ab-b775-d771839bb2ee","Type":"ContainerStarted","Data":"84fc70d3527bd0560bceea72c31857e91fd3494f06c3eea472f1348e73f9dd05"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.903020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.903179 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.403157376 +0000 UTC m=+151.575307233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.903350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:27 crc kubenswrapper[4749]: E1129 01:13:27.903716 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.403707583 +0000 UTC m=+151.575857440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.904627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" event={"ID":"07e6f770-684e-4d69-9d50-f85af092d6bc","Type":"ContainerStarted","Data":"18582dca3ea1ff1ce9607955b1eeea9e5c8018f5f76ed41c94a596c364590d01"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.906280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4e819700c123c26cfde216bfa07af743d497fed42d8b7279243b69ed79b65c53"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.908002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" event={"ID":"8e8ea46b-ee66-4110-a123-dfe393cb4abf","Type":"ContainerStarted","Data":"38f0c4ccb4efdd44bb35215a0dd74bd06a0482a8c82625376d6725d2751c8f35"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.909881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xxssl" event={"ID":"379d4463-1203-41ec-8a7d-f8474a23e7b8","Type":"ContainerStarted","Data":"b0fea00ce9dd6cace6f5f6abfe2d22cf902294039ba41f59a83065223055a151"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.912468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" 
event={"ID":"534e3d1d-c71c-44e9-a589-b198ee8fa4e0","Type":"ContainerStarted","Data":"de628ddd795e63254a69594780e54a00247a34854922dfb23d5fbebc15a25696"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.914956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jgbtg" event={"ID":"6ccb5157-d0d7-4fbc-821b-cd249461519b","Type":"ContainerStarted","Data":"bf6d583c5417a56d20180705c5c65adac84d06fe277c257306430cdbdd0b4c39"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.915025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jgbtg" event={"ID":"6ccb5157-d0d7-4fbc-821b-cd249461519b","Type":"ContainerStarted","Data":"8f8c67fe5edb931f59f3a4b41ec3e47c0d19d604840d8cdb03b98fb05904f9f2"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.915228 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.916732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" event={"ID":"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb","Type":"ContainerStarted","Data":"5c69243eda97c37f8720153eab0994b0d1237d015f0b0b8309e723ec9b6178e0"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.916780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" event={"ID":"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb","Type":"ContainerStarted","Data":"ada60cf84736651917e12d2bf5bc2dfda15b6dac5b5ca8c953d76888f817dfbe"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.918755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" event={"ID":"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59","Type":"ContainerStarted","Data":"82914d0906a7768f9faff877e5136e248556db110fa020d59f3ac4b908c72939"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.918799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" event={"ID":"e3bc3d9f-ca91-4ab2-99f2-0558be9adf59","Type":"ContainerStarted","Data":"36f4ec706daa99abc6e4fcffdf2db0c9b13d710ae0a770d847197cd11fe79da1"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.918904 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.920742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7f231a716eb77989338b4dfbf9bd427de321e5a2f3dfb6e9acce8ccc5033aaa8"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.922573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"85ea8cf9f593c0fa1b71d09f3545f0c622363f4ea36a9bcad4879d83e09756bb"} Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.923734 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47pxv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Nov 29 
01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.923809 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.924419 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cq9l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.924467 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cq9l" podUID="ceffc235-a012-44b5-91ff-f45a19502453" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.929824 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5pbzn" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.935490 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.960598 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" podStartSLOduration=127.960577043 podStartE2EDuration="2m7.960577043s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:27.937652523 +0000 UTC m=+151.109802390" watchObservedRunningTime="2025-11-29 01:13:27.960577043 +0000 UTC m=+151.132726900" Nov 29 01:13:27 crc kubenswrapper[4749]: I1129 01:13:27.986697 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" podStartSLOduration=127.986677037 podStartE2EDuration="2m7.986677037s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:27.965902453 +0000 UTC m=+151.138052320" watchObservedRunningTime="2025-11-29 01:13:27.986677037 +0000 UTC m=+151.158826894" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.004519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:28 crc kubenswrapper[4749]: E1129 01:13:28.004675 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.504650058 +0000 UTC m=+151.676799915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.004910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: E1129 01:13:28.008715 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.508696089 +0000 UTC m=+151.680845946 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.029700 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2d5l2" podStartSLOduration=128.02967618 podStartE2EDuration="2m8.02967618s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:27.989125641 +0000 UTC m=+151.161275488" watchObservedRunningTime="2025-11-29 01:13:28.02967618 +0000 UTC m=+151.201826027" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.109837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:28 crc kubenswrapper[4749]: E1129 01:13:28.110307 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.610285854 +0000 UTC m=+151.782435711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.122079 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" podStartSLOduration=128.122056478 podStartE2EDuration="2m8.122056478s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:28.07656406 +0000 UTC m=+151.248713927" watchObservedRunningTime="2025-11-29 01:13:28.122056478 +0000 UTC m=+151.294206345" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.158871 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nhn9s" podStartSLOduration=128.158848104 podStartE2EDuration="2m8.158848104s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:28.147334117 +0000 UTC m=+151.319483984" watchObservedRunningTime="2025-11-29 01:13:28.158848104 +0000 UTC m=+151.330997971" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.189011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jgbtg" podStartSLOduration=10.188974609 podStartE2EDuration="10.188974609s" podCreationTimestamp="2025-11-29 01:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:28.171511854 +0000 UTC m=+151.343661721" watchObservedRunningTime="2025-11-29 01:13:28.188974609 +0000 UTC m=+151.361124466" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.222469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: E1129 01:13:28.222927 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-29 01:13:28.7229109 +0000 UTC m=+151.895060757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hkqth" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.233512 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-rlstl" podStartSLOduration=128.233486688 podStartE2EDuration="2m8.233486688s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:28.222129896 +0000 UTC m=+151.394279753" watchObservedRunningTime="2025-11-29 01:13:28.233486688 +0000 UTC m=+151.405636565" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.278422 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-29T01:13:27.8361038Z","Handler":null,"Name":""} Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.293942 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.293995 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.326769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.410227 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qr8r9"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.411134 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.415700 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.430435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr8r9"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.531127 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.531206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cj8\" (UniqueName: \"kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.531265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.533802 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:28 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:28 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:28 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.534070 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.581600 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.582880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.584053 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.584778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.604619 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cj8\" (UniqueName: \"kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpq2\" (UniqueName: \"kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.632687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.634762 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.634792 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.656397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cj8\" (UniqueName: \"kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8\") pod \"community-operators-qr8r9\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") " pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.680672 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hkqth\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.734185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.734278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpq2\" (UniqueName: \"kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.734329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.734938 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.734958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.747985 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.758977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpq2\" (UniqueName: \"kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2\") pod \"certified-operators-5rwk9\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") " pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.799207 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.800773 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.813983 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.835585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.835667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.835699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v9zs\" (UniqueName: \"kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.849362 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.896373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.936915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.936967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.936987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v9zs\" (UniqueName: \"kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.949581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.949706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.959480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" event={"ID":"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb","Type":"ContainerStarted","Data":"5b2df3b1945d1ef2f57f4f90dabf53d279fdfc5a0900488464d2c1703b4279b7"} Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.959552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" event={"ID":"8a077b1a-6a10-45ad-bff5-d54e5d0fe1cb","Type":"ContainerStarted","Data":"a4917605693ba483a86bd0bcc0f3812ad57c436e1d149f6ca2921844d69d3e68"} Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.975570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v9zs\" (UniqueName: \"kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs\") pod \"community-operators-bj6rw\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.976875 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.983910 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:13:28 crc kubenswrapper[4749]: I1129 01:13:28.991052 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tcjx5" podStartSLOduration=10.991028422 podStartE2EDuration="10.991028422s" podCreationTimestamp="2025-11-29 01:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:28.986903408 +0000 UTC m=+152.159053275" watchObservedRunningTime="2025-11-29 01:13:28.991028422 +0000 UTC m=+152.163178279" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.002428 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.039029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.039170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.039345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p562h\" (UniqueName: \"kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.047808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.086893 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.099553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qr8r9"] Nov 29 01:13:29 crc kubenswrapper[4749]: W1129 01:13:29.110850 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd82406_f875_4ec7_bbe9_8424b2725f51.slice/crio-07d62d0b609c2270412b4eac3a202d2d200b78a1d479c13cd9180c95d8d9ecf2 WatchSource:0}: Error finding container 07d62d0b609c2270412b4eac3a202d2d200b78a1d479c13cd9180c95d8d9ecf2: Status 404 returned error can't find the container with id 07d62d0b609c2270412b4eac3a202d2d200b78a1d479c13cd9180c95d8d9ecf2 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.120583 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.140873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.140934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p562h\" (UniqueName: \"kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.141031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.141517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.142014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.160388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p562h\" (UniqueName: \"kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h\") pod \"certified-operators-mh54f\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.173440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"] Nov 29 01:13:29 crc kubenswrapper[4749]: W1129 01:13:29.191045 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d WatchSource:0}: Error finding container 6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d: Status 404 returned error can't find the container with id 6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.258574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"] Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.356149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:13:29 crc kubenswrapper[4749]: W1129 
01:13:29.364013 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dea467d_7ff5_488f_b8cd_0a7bb63b9ea5.slice/crio-5d0e8e995deb4fbed6060bbde014ea8392da25cc0dfdb030bb262d6f8dd80e68 WatchSource:0}: Error finding container 5d0e8e995deb4fbed6060bbde014ea8392da25cc0dfdb030bb262d6f8dd80e68: Status 404 returned error can't find the container with id 5d0e8e995deb4fbed6060bbde014ea8392da25cc0dfdb030bb262d6f8dd80e68 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.367467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.534584 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:29 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:29 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:29 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.534650 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.590133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.612386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6qbnn" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.966043 4749 generic.go:334] "Generic (PLEG): container finished" podID="2cf44185-40f9-4d15-93a0-13604622e06f" containerID="f22f79437c777e2c389216011056a2d471465b655bdb3c16cfd4cd502d6055c6" exitCode=0 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.966522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" event={"ID":"2cf44185-40f9-4d15-93a0-13604622e06f","Type":"ContainerDied","Data":"f22f79437c777e2c389216011056a2d471465b655bdb3c16cfd4cd502d6055c6"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.969603 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerID="0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869" exitCode=0 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.969922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerDied","Data":"0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.969963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerStarted","Data":"5d0e8e995deb4fbed6060bbde014ea8392da25cc0dfdb030bb262d6f8dd80e68"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.972101 4749 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.973436 4749 generic.go:334] "Generic (PLEG): container finished" podID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerID="5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4" exitCode=0 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.973529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerDied","Data":"5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.973584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerStarted","Data":"07d62d0b609c2270412b4eac3a202d2d200b78a1d479c13cd9180c95d8d9ecf2"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.977460 4749 generic.go:334] "Generic (PLEG): container finished" podID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerID="635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d" exitCode=0 Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.977532 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerDied","Data":"635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.977560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerStarted","Data":"8b87e758bac0ee2acdc1b66d8438ec72517fb4ad79b82aaa1c5116128a7f68d8"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.980821 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" event={"ID":"21e2450f-f4fe-41bd-bbc9-abcc3f03400d","Type":"ContainerStarted","Data":"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.980864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" event={"ID":"21e2450f-f4fe-41bd-bbc9-abcc3f03400d","Type":"ContainerStarted","Data":"6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.981566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.987635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerStarted","Data":"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb"} Nov 29 01:13:29 crc kubenswrapper[4749]: I1129 01:13:29.987677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerStarted","Data":"28b654b4fe3027b0cca02972c9d9726e84520f60b55fa6efb2282fd50a7fecbb"} Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.093663 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" podStartSLOduration=130.093644261 podStartE2EDuration="2m10.093644261s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:30.075776504 +0000 UTC m=+153.247926361" watchObservedRunningTime="2025-11-29 01:13:30.093644261 +0000 UTC m=+153.265794118" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.381308 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"] Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.383910 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.386893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.387743 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"] Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.462556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lrj\" (UniqueName: \"kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.462623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.462718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.532621 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:30 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:30 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:30 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.532689 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.534814 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.534872 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.548406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.564556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.564722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.564790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56lrj\" (UniqueName: \"kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.565810 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.565864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.602067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lrj\" (UniqueName: \"kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj\") pod \"redhat-marketplace-hw8rb\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") " pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.706170 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.760458 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cq9l container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.760520 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8cq9l" podUID="ceffc235-a012-44b5-91ff-f45a19502453" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.760681 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-8cq9l container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.760747 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8cq9l" podUID="ceffc235-a012-44b5-91ff-f45a19502453" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.770499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.772047 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.800058 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.871867 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.871913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.871984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkx9b\" (UniqueName: \"kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.950229 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"] Nov 29 01:13:30 crc kubenswrapper[4749]: W1129 01:13:30.962621 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6e1d685_7223_41bd_b180_ee357d754e89.slice/crio-23282cc5f3bfd955a34bfbb16401a64391413033269811f6470c93fbf695c7ea WatchSource:0}: Error finding container 23282cc5f3bfd955a34bfbb16401a64391413033269811f6470c93fbf695c7ea: Status 404 returned error can't find the container with id 23282cc5f3bfd955a34bfbb16401a64391413033269811f6470c93fbf695c7ea Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.973158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.973214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.973267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkx9b\" (UniqueName: \"kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.975469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.975655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:30 crc kubenswrapper[4749]: I1129 01:13:30.992254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkx9b\" (UniqueName: \"kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b\") pod \"redhat-marketplace-8dwzp\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.008467 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerID="8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb" exitCode=0 Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.008542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerDied","Data":"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb"} Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.015260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" 
event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerStarted","Data":"23282cc5f3bfd955a34bfbb16401a64391413033269811f6470c93fbf695c7ea"} Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.020634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wl6jq" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.112597 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.337295 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.337454 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.342556 4749 patch_prober.go:28] interesting pod/console-f9d7485db-klx5g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.342703 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-klx5g" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.343175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.343244 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.352809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.419309 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.473370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:13:31 crc kubenswrapper[4749]: W1129 01:13:31.479616 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda177ce1b_07ee_4839_8725_539c031f9610.slice/crio-5afa079442e1237fa5ca4a164127bfbea4d626461d4657156221f47ecdc06217 WatchSource:0}: Error finding container 5afa079442e1237fa5ca4a164127bfbea4d626461d4657156221f47ecdc06217: Status 404 returned error can't find the container with id 5afa079442e1237fa5ca4a164127bfbea4d626461d4657156221f47ecdc06217 Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.531451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.537592 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:31 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:31 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:31 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.537677 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.585601 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"] Nov 29 01:13:31 crc kubenswrapper[4749]: E1129 01:13:31.587113 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf44185-40f9-4d15-93a0-13604622e06f" containerName="collect-profiles" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.587141 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf44185-40f9-4d15-93a0-13604622e06f" containerName="collect-profiles" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.587519 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf44185-40f9-4d15-93a0-13604622e06f" containerName="collect-profiles" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.592207 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.598750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume\") pod \"2cf44185-40f9-4d15-93a0-13604622e06f\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.598905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttvj7\" (UniqueName: \"kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7\") pod \"2cf44185-40f9-4d15-93a0-13604622e06f\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.599053 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume\") pod \"2cf44185-40f9-4d15-93a0-13604622e06f\" (UID: \"2cf44185-40f9-4d15-93a0-13604622e06f\") " Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.600884 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.601665 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume" (OuterVolumeSpecName: "config-volume") pod "2cf44185-40f9-4d15-93a0-13604622e06f" (UID: "2cf44185-40f9-4d15-93a0-13604622e06f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.608632 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7" (OuterVolumeSpecName: "kube-api-access-ttvj7") pod "2cf44185-40f9-4d15-93a0-13604622e06f" (UID: "2cf44185-40f9-4d15-93a0-13604622e06f"). InnerVolumeSpecName "kube-api-access-ttvj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.614346 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"] Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.614610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2cf44185-40f9-4d15-93a0-13604622e06f" (UID: "2cf44185-40f9-4d15-93a0-13604622e06f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.703500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.703738 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6v4\" (UniqueName: \"kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.703972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.704059 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cf44185-40f9-4d15-93a0-13604622e06f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.704071 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttvj7\" (UniqueName: \"kubernetes.io/projected/2cf44185-40f9-4d15-93a0-13604622e06f-kube-api-access-ttvj7\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.704084 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cf44185-40f9-4d15-93a0-13604622e06f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.805782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.806295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.806323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6v4\" (UniqueName: \"kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.806406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities\") pod \"redhat-operators-c2rn4\" (UID: 
\"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.806934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.830042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6v4\" (UniqueName: \"kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4\") pod \"redhat-operators-c2rn4\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") " pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.923693 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.971277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.973163 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:31 crc kubenswrapper[4749]: I1129 01:13:31.986083 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.009515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.009562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.009630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8rm\" (UniqueName: \"kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.104431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" event={"ID":"2cf44185-40f9-4d15-93a0-13604622e06f","Type":"ContainerDied","Data":"3ccad68fc65e3aa03bc6943c421b50a0adb8c22719504ac98a0dbad1d794b5cf"} Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.104513 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ccad68fc65e3aa03bc6943c421b50a0adb8c22719504ac98a0dbad1d794b5cf" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.104637 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.112457 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6e1d685-7223-41bd-b180-ee357d754e89" containerID="953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1" exitCode=0 Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.112993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerDied","Data":"953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1"} Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.114870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.114945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.115090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8rm\" (UniqueName: \"kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.115927 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.116260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.134127 4749 generic.go:334] "Generic (PLEG): container finished" podID="a177ce1b-07ee-4839-8725-539c031f9610" containerID="1f76515d1a2c2147dd9b0add8b485b80ada61d380065fbe20fe0f08ad769fa36" exitCode=0 Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.136489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerDied","Data":"1f76515d1a2c2147dd9b0add8b485b80ada61d380065fbe20fe0f08ad769fa36"} Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.136518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerStarted","Data":"5afa079442e1237fa5ca4a164127bfbea4d626461d4657156221f47ecdc06217"} Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 
01:13:32.147468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dhpfn" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.156048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8rm\" (UniqueName: \"kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm\") pod \"redhat-operators-2tb8v\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.307714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.479304 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"] Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.534482 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:32 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:32 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:32 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.534567 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.668795 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.671567 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.674852 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.674921 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.683594 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.836965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.837856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.899869 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.941444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.941537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.941639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.972820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:32 crc kubenswrapper[4749]: I1129 01:13:32.997307 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.196229 4749 generic.go:334] "Generic (PLEG): container finished" podID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerID="50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51" exitCode=0 Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.196455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerDied","Data":"50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51"} Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.197379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerStarted","Data":"690343b879f624738b826f7d5e5c9e3881d624e68fa34fc2a6fe0d402f87c3d2"} Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.202956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerStarted","Data":"376c5be0d182a3de4aa2d0537342d2a15c043427e0948228674308c10f40b36e"} Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.374974 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.375785 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.380306 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.380483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.385836 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.453225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.453296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.542351 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 29 01:13:33 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Nov 29 01:13:33 crc kubenswrapper[4749]: [+]process-running ok Nov 29 01:13:33 crc kubenswrapper[4749]: healthz check failed Nov 29 01:13:33 crc 
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.554590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.554824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.554915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.587395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.766881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 29 01:13:33 crc kubenswrapper[4749]: I1129 01:13:33.842426 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Nov 29 01:13:33 crc kubenswrapper[4749]: W1129 01:13:33.852617 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc913a306_b115_4d50_ac16_51ff813db661.slice/crio-e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120 WatchSource:0}: Error finding container e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120: Status 404 returned error can't find the container with id e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.086487 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.248639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b34e7eb-0574-46fb-9a2e-1678613f6933","Type":"ContainerStarted","Data":"14549e4c9617bb3bd31aef5b1c1b80222e37b3c9a6f73885b00bd0005ebf2114"}
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.255165 4749 generic.go:334] "Generic (PLEG): container finished" podID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerID="e3045091df96f05c99a1eb41c14e7003b425fce365a61bf14cb04a3fc057e005" exitCode=0
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.255233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerDied","Data":"e3045091df96f05c99a1eb41c14e7003b425fce365a61bf14cb04a3fc057e005"}
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.261581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c913a306-b115-4d50-ac16-51ff813db661","Type":"ContainerStarted","Data":"e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120"}
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.534435 4749 patch_prober.go:28] interesting pod/router-default-5444994796-5t7v2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 29 01:13:34 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld
Nov 29 01:13:34 crc kubenswrapper[4749]: [+]process-running ok
Nov 29 01:13:34 crc kubenswrapper[4749]: healthz check failed
Nov 29 01:13:34 crc kubenswrapper[4749]: I1129 01:13:34.534525 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5t7v2" podUID="f1015f6f-5ba5-472f-8a50-4b16bfdbf016" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.281624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c913a306-b115-4d50-ac16-51ff813db661","Type":"ContainerStarted","Data":"c1903c1670c3c83bb4dfc48f9c81be7bd4cdc8b011ef0e69490e3ef139587854"}
Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.290068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b34e7eb-0574-46fb-9a2e-1678613f6933","Type":"ContainerStarted","Data":"97e22af0bfd8c3475bb391a3851036353bb076dd2929b6cc4144ff1d54c46a22"}
event={"ID":"0b34e7eb-0574-46fb-9a2e-1678613f6933","Type":"ContainerStarted","Data":"97e22af0bfd8c3475bb391a3851036353bb076dd2929b6cc4144ff1d54c46a22"} Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.307274 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.307251842 podStartE2EDuration="3.307251842s" podCreationTimestamp="2025-11-29 01:13:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:35.298756546 +0000 UTC m=+158.470906423" watchObservedRunningTime="2025-11-29 01:13:35.307251842 +0000 UTC m=+158.479401699" Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.322122 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.322096258 podStartE2EDuration="2.322096258s" podCreationTimestamp="2025-11-29 01:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:13:35.31883237 +0000 UTC m=+158.490982247" watchObservedRunningTime="2025-11-29 01:13:35.322096258 +0000 UTC m=+158.494246115" Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.544323 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:35 crc kubenswrapper[4749]: I1129 01:13:35.550409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5t7v2" Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.345570 4749 generic.go:334] "Generic (PLEG): container finished" podID="c913a306-b115-4d50-ac16-51ff813db661" containerID="c1903c1670c3c83bb4dfc48f9c81be7bd4cdc8b011ef0e69490e3ef139587854" exitCode=0 Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.345656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c913a306-b115-4d50-ac16-51ff813db661","Type":"ContainerDied","Data":"c1903c1670c3c83bb4dfc48f9c81be7bd4cdc8b011ef0e69490e3ef139587854"} Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.365333 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-kg6tt_18a54704-5d0d-4caa-b347-2751a525a666/cluster-samples-operator/0.log" Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.365397 4749 generic.go:334] "Generic (PLEG): container finished" podID="18a54704-5d0d-4caa-b347-2751a525a666" containerID="29379f81ce13c0c793f2d98b8585314936b8621dda2b85721f1ef40500648f9e" exitCode=2 Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.365530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" event={"ID":"18a54704-5d0d-4caa-b347-2751a525a666","Type":"ContainerDied","Data":"29379f81ce13c0c793f2d98b8585314936b8621dda2b85721f1ef40500648f9e"} Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.366501 4749 scope.go:117] "RemoveContainer" containerID="29379f81ce13c0c793f2d98b8585314936b8621dda2b85721f1ef40500648f9e" Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.379021 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b34e7eb-0574-46fb-9a2e-1678613f6933" 
containerID="97e22af0bfd8c3475bb391a3851036353bb076dd2929b6cc4144ff1d54c46a22" exitCode=0 Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.379480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b34e7eb-0574-46fb-9a2e-1678613f6933","Type":"ContainerDied","Data":"97e22af0bfd8c3475bb391a3851036353bb076dd2929b6cc4144ff1d54c46a22"} Nov 29 01:13:36 crc kubenswrapper[4749]: I1129 01:13:36.613342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jgbtg" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.418016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-kg6tt_18a54704-5d0d-4caa-b347-2751a525a666/cluster-samples-operator/0.log" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.418153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kg6tt" event={"ID":"18a54704-5d0d-4caa-b347-2751a525a666","Type":"ContainerStarted","Data":"80f8026ac4426d7187b23c264d028bb0d2a1cef7c9368068e2d9693cd7b18d5a"} Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.703372 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.788540 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.865866 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir\") pod \"c913a306-b115-4d50-ac16-51ff813db661\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.865958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c913a306-b115-4d50-ac16-51ff813db661" (UID: "c913a306-b115-4d50-ac16-51ff813db661"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.866160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access\") pod \"c913a306-b115-4d50-ac16-51ff813db661\" (UID: \"c913a306-b115-4d50-ac16-51ff813db661\") " Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.866549 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c913a306-b115-4d50-ac16-51ff813db661-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.876615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c913a306-b115-4d50-ac16-51ff813db661" (UID: "c913a306-b115-4d50-ac16-51ff813db661"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.967916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir\") pod \"0b34e7eb-0574-46fb-9a2e-1678613f6933\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.968047 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b34e7eb-0574-46fb-9a2e-1678613f6933" (UID: "0b34e7eb-0574-46fb-9a2e-1678613f6933"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.968177 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access\") pod \"0b34e7eb-0574-46fb-9a2e-1678613f6933\" (UID: \"0b34e7eb-0574-46fb-9a2e-1678613f6933\") " Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.968567 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b34e7eb-0574-46fb-9a2e-1678613f6933-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.968591 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c913a306-b115-4d50-ac16-51ff813db661-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:37 crc kubenswrapper[4749]: I1129 01:13:37.972967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b34e7eb-0574-46fb-9a2e-1678613f6933" (UID: "0b34e7eb-0574-46fb-9a2e-1678613f6933"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.071580 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b34e7eb-0574-46fb-9a2e-1678613f6933-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.433328 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.433371 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0b34e7eb-0574-46fb-9a2e-1678613f6933","Type":"ContainerDied","Data":"14549e4c9617bb3bd31aef5b1c1b80222e37b3c9a6f73885b00bd0005ebf2114"} Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.433461 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14549e4c9617bb3bd31aef5b1c1b80222e37b3c9a6f73885b00bd0005ebf2114" Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.438670 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.445165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c913a306-b115-4d50-ac16-51ff813db661","Type":"ContainerDied","Data":"e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120"} Nov 29 01:13:38 crc kubenswrapper[4749]: I1129 01:13:38.445239 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0d04af78a2a11dceed3ccbd2a6f86f83abad9ddaef2b47583f04e655d4dd120" Nov 29 01:13:40 crc kubenswrapper[4749]: I1129 01:13:40.765671 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8cq9l" Nov 29 01:13:41 crc kubenswrapper[4749]: I1129 01:13:41.358352 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:41 crc kubenswrapper[4749]: I1129 01:13:41.362015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:13:42 crc kubenswrapper[4749]: I1129 01:13:42.787053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:42 crc kubenswrapper[4749]: I1129 01:13:42.811826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2bba1226-0e27-4cea-9eaa-d653f2061ec1-metrics-certs\") pod \"network-metrics-daemon-nczdn\" (UID: \"2bba1226-0e27-4cea-9eaa-d653f2061ec1\") " pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:42 crc kubenswrapper[4749]: I1129 01:13:42.897008 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nczdn" Nov 29 01:13:48 crc kubenswrapper[4749]: I1129 01:13:48.858672 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:13:55 crc kubenswrapper[4749]: I1129 01:13:55.374682 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:13:55 crc kubenswrapper[4749]: I1129 01:13:55.375442 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.140756 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.141390 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9v9zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bj6rw_openshift-marketplace(7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.142761 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/community-operators-bj6rw" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.166590 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.166766 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6cj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qr8r9_openshift-marketplace(1dd82406-f875-4ec7-bbe9-8424b2725f51): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.168463 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qr8r9" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.524895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nczdn"] Nov 29 01:14:01 crc kubenswrapper[4749]: W1129 01:14:01.534779 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bba1226_0e27_4cea_9eaa_d653f2061ec1.slice/crio-264b3dbf3b46bd53e0a714c52417ec8c304b316c88cbf9ac65a3d45dd45afa37 WatchSource:0}: Error finding container 264b3dbf3b46bd53e0a714c52417ec8c304b316c88cbf9ac65a3d45dd45afa37: Status 404 returned error can't find the container with id 264b3dbf3b46bd53e0a714c52417ec8c304b316c88cbf9ac65a3d45dd45afa37 Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.629898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerStarted","Data":"7d93c5bcc8810d5ad8222fbfddf0366fb3a65c3469f66f49303c951b177d882e"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.638351 4749 generic.go:334] "Generic (PLEG): container finished" podID="a177ce1b-07ee-4839-8725-539c031f9610" containerID="4d0b627ca71aee74963d0e76c36b2f0a1a84e6126e4f9b8b0209b16536a76096" exitCode=0 Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.638511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerDied","Data":"4d0b627ca71aee74963d0e76c36b2f0a1a84e6126e4f9b8b0209b16536a76096"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.640276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nczdn" event={"ID":"2bba1226-0e27-4cea-9eaa-d653f2061ec1","Type":"ContainerStarted","Data":"264b3dbf3b46bd53e0a714c52417ec8c304b316c88cbf9ac65a3d45dd45afa37"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.643171 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6e1d685-7223-41bd-b180-ee357d754e89" containerID="9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b" exitCode=0 Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.643244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerDied","Data":"9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.657491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerStarted","Data":"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.661285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerStarted","Data":"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"} Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.667605 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerID="ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45" exitCode=0 Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.668096 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerDied","Data":"ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45"} Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.670827 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bj6rw" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" Nov 29 01:14:01 crc kubenswrapper[4749]: E1129 01:14:01.672025 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qr8r9" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" Nov 29 01:14:01 crc kubenswrapper[4749]: I1129 01:14:01.855730 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.676980 4749 generic.go:334] "Generic (PLEG): container finished" podID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerID="672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56" exitCode=0 Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.677228 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerDied","Data":"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"} Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.686529 4749 generic.go:334] "Generic (PLEG): container finished" podID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerID="7d93c5bcc8810d5ad8222fbfddf0366fb3a65c3469f66f49303c951b177d882e" exitCode=0 Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.686615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerDied","Data":"7d93c5bcc8810d5ad8222fbfddf0366fb3a65c3469f66f49303c951b177d882e"} Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.688583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nczdn" event={"ID":"2bba1226-0e27-4cea-9eaa-d653f2061ec1","Type":"ContainerStarted","Data":"a6ebed56efcce5d9a2fb107dee6138288cb0fad2ce264e6daecadeef780e1144"} Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.692640 4749 generic.go:334] "Generic (PLEG): container finished" podID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerID="e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272" exitCode=0 Nov 29 01:14:02 crc kubenswrapper[4749]: I1129 01:14:02.692709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerDied","Data":"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"} Nov 29 01:14:03 crc kubenswrapper[4749]: I1129 01:14:03.343471 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 29 01:14:03 crc kubenswrapper[4749]: I1129 01:14:03.699078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nczdn" event={"ID":"2bba1226-0e27-4cea-9eaa-d653f2061ec1","Type":"ContainerStarted","Data":"6539016e6c633b848dc3d80ceef2a0cb9978e3771efb0e4b3f00fc65a1a4504d"} Nov 29 01:14:03 crc kubenswrapper[4749]: I1129 01:14:03.719261 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nczdn" podStartSLOduration=163.719237305 podStartE2EDuration="2m43.719237305s" podCreationTimestamp="2025-11-29 01:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:14:03.716657948 +0000 UTC m=+186.888807825" watchObservedRunningTime="2025-11-29 01:14:03.719237305 +0000 UTC m=+186.891387162" Nov 29 01:14:06 
crc kubenswrapper[4749]: I1129 01:14:06.721615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerStarted","Data":"b4b90ebfb5dd08ad8561f59686adb580bb05c8760c3be71aaac90db4223baf2d"} Nov 29 01:14:06 crc kubenswrapper[4749]: I1129 01:14:06.743671 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8dwzp" podStartSLOduration=4.124857854 podStartE2EDuration="36.74364684s" podCreationTimestamp="2025-11-29 01:13:30 +0000 UTC" firstStartedPulling="2025-11-29 01:13:32.140520388 +0000 UTC m=+155.312670245" lastFinishedPulling="2025-11-29 01:14:04.759309374 +0000 UTC m=+187.931459231" observedRunningTime="2025-11-29 01:14:06.74265805 +0000 UTC m=+189.914807907" watchObservedRunningTime="2025-11-29 01:14:06.74364684 +0000 UTC m=+189.915796707" Nov 29 01:14:08 crc kubenswrapper[4749]: I1129 01:14:08.750468 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerStarted","Data":"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"} Nov 29 01:14:08 crc kubenswrapper[4749]: I1129 01:14:08.789383 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw8rb" podStartSLOduration=3.540988333 podStartE2EDuration="38.789325482s" podCreationTimestamp="2025-11-29 01:13:30 +0000 UTC" firstStartedPulling="2025-11-29 01:13:32.140378504 +0000 UTC m=+155.312528361" lastFinishedPulling="2025-11-29 01:14:07.388715613 +0000 UTC m=+190.560865510" observedRunningTime="2025-11-29 01:14:08.779108504 +0000 UTC m=+191.951258361" watchObservedRunningTime="2025-11-29 01:14:08.789325482 +0000 UTC m=+191.961475359" Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.825619 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerStarted","Data":"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"} Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.832727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerStarted","Data":"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"} Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.838565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerStarted","Data":"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d"} Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.861811 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5rwk9" podStartSLOduration=3.319630579 podStartE2EDuration="41.861779534s" podCreationTimestamp="2025-11-29 01:13:28 +0000 UTC" firstStartedPulling="2025-11-29 01:13:29.979916692 +0000 UTC m=+153.152066549" lastFinishedPulling="2025-11-29 01:14:08.522065647 +0000 UTC m=+191.694215504" observedRunningTime="2025-11-29 01:14:09.861236027 +0000 UTC m=+193.033385894" watchObservedRunningTime="2025-11-29 01:14:09.861779534 +0000 UTC m=+193.033929391" Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.867949 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerStarted","Data":"04a66febe7afa72a4ef08bcfc439bb2adc23807b554336955e4156a65961b9b0"} Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.882547 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mh54f" podStartSLOduration=4.446251979 podStartE2EDuration="41.882496687s" podCreationTimestamp="2025-11-29 01:13:28 +0000 UTC" firstStartedPulling="2025-11-29 01:13:31.010880926 +0000 UTC m=+154.183030793" lastFinishedPulling="2025-11-29 01:14:08.447125644 +0000 UTC m=+191.619275501" observedRunningTime="2025-11-29 01:14:09.879795715 +0000 UTC m=+193.051945582" watchObservedRunningTime="2025-11-29 01:14:09.882496687 +0000 UTC m=+193.054646544" Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.898740 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2rn4" podStartSLOduration=3.646419815 podStartE2EDuration="38.898709874s" podCreationTimestamp="2025-11-29 01:13:31 +0000 UTC" firstStartedPulling="2025-11-29 01:13:33.199859846 +0000 UTC m=+156.372009703" lastFinishedPulling="2025-11-29 01:14:08.452149905 +0000 UTC m=+191.624299762" observedRunningTime="2025-11-29 01:14:09.896112546 +0000 UTC m=+193.068262403" watchObservedRunningTime="2025-11-29 01:14:09.898709874 +0000 UTC m=+193.070859721" Nov 29 01:14:09 crc kubenswrapper[4749]: I1129 01:14:09.934366 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2tb8v" podStartSLOduration=4.558876436 podStartE2EDuration="38.934330484s" podCreationTimestamp="2025-11-29 01:13:31 +0000 UTC" firstStartedPulling="2025-11-29 01:13:34.261559724 +0000 UTC m=+157.433709581" lastFinishedPulling="2025-11-29 01:14:08.637013772 +0000 UTC m=+191.809163629" observedRunningTime="2025-11-29 01:14:09.927307654 +0000 UTC m=+193.099457521" watchObservedRunningTime="2025-11-29 01:14:09.934330484 +0000 UTC m=+193.106480341" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.119435 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 01:14:10 crc kubenswrapper[4749]: E1129 01:14:10.119829 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b34e7eb-0574-46fb-9a2e-1678613f6933" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.119848 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b34e7eb-0574-46fb-9a2e-1678613f6933" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: E1129 01:14:10.119898 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c913a306-b115-4d50-ac16-51ff813db661" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.119909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c913a306-b115-4d50-ac16-51ff813db661" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.120069 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c913a306-b115-4d50-ac16-51ff813db661" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.120095 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b34e7eb-0574-46fb-9a2e-1678613f6933" containerName="pruner" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.120837 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.125298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.125723 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.129529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.165444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.165506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.266915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.266986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.267510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.293965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.438460 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.655092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.706948 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:14:10 crc kubenswrapper[4749]: I1129 01:14:10.707009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.113455 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.113573 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.336187 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.343676 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.883332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99aba3da-2e2b-43d2-ae9b-cc8df62f3108","Type":"ContainerStarted","Data":"da02565ecebb50fce51b742336d367b7703e4f90eb16dabd2c368117e1579af5"} Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.924303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.924384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:14:11 crc kubenswrapper[4749]: I1129 01:14:11.931225 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:12 crc kubenswrapper[4749]: I1129 01:14:12.308412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:12 crc kubenswrapper[4749]: I1129 01:14:12.308468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:12 crc kubenswrapper[4749]: I1129 01:14:12.894012 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99aba3da-2e2b-43d2-ae9b-cc8df62f3108","Type":"ContainerStarted","Data":"09950d14228d1b741ae23d0c4af802882848cacf732e2be5fb7f7ae938d250d0"} Nov 29 01:14:12 crc kubenswrapper[4749]: I1129 01:14:12.916386 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.916350486 podStartE2EDuration="2.916350486s" podCreationTimestamp="2025-11-29 01:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:14:12.912664835 +0000 UTC m=+196.084814692" watchObservedRunningTime="2025-11-29 01:14:12.916350486 +0000 UTC m=+196.088500363" Nov 29 
01:14:12 crc kubenswrapper[4749]: I1129 01:14:12.967923 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2rn4" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="registry-server" probeResult="failure" output=< Nov 29 01:14:12 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:14:12 crc kubenswrapper[4749]: > Nov 29 01:14:13 crc kubenswrapper[4749]: I1129 01:14:13.390344 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2tb8v" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="registry-server" probeResult="failure" output=< Nov 29 01:14:13 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:14:13 crc kubenswrapper[4749]: > Nov 29 01:14:13 crc kubenswrapper[4749]: I1129 01:14:13.902719 4749 generic.go:334] "Generic (PLEG): container finished" podID="99aba3da-2e2b-43d2-ae9b-cc8df62f3108" containerID="09950d14228d1b741ae23d0c4af802882848cacf732e2be5fb7f7ae938d250d0" exitCode=0 Nov 29 01:14:13 crc kubenswrapper[4749]: I1129 01:14:13.902797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99aba3da-2e2b-43d2-ae9b-cc8df62f3108","Type":"ContainerDied","Data":"09950d14228d1b741ae23d0c4af802882848cacf732e2be5fb7f7ae938d250d0"} Nov 29 01:14:13 crc kubenswrapper[4749]: I1129 01:14:13.982390 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:14:13 crc kubenswrapper[4749]: I1129 01:14:13.982656 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8dwzp" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="registry-server" containerID="cri-o://b4b90ebfb5dd08ad8561f59686adb580bb05c8760c3be71aaac90db4223baf2d" gracePeriod=2 Nov 29 01:14:14 crc kubenswrapper[4749]: I1129 01:14:14.917529 4749 generic.go:334] "Generic (PLEG): container finished" podID="a177ce1b-07ee-4839-8725-539c031f9610" containerID="b4b90ebfb5dd08ad8561f59686adb580bb05c8760c3be71aaac90db4223baf2d" exitCode=0 Nov 29 01:14:14 crc kubenswrapper[4749]: I1129 01:14:14.917569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerDied","Data":"b4b90ebfb5dd08ad8561f59686adb580bb05c8760c3be71aaac90db4223baf2d"} Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.398733 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.498191 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access\") pod \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.498313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir\") pod \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\" (UID: \"99aba3da-2e2b-43d2-ae9b-cc8df62f3108\") " Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.498528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "99aba3da-2e2b-43d2-ae9b-cc8df62f3108" (UID: "99aba3da-2e2b-43d2-ae9b-cc8df62f3108"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.517654 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 01:14:17 crc kubenswrapper[4749]: E1129 01:14:17.517907 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99aba3da-2e2b-43d2-ae9b-cc8df62f3108" containerName="pruner" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.517922 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99aba3da-2e2b-43d2-ae9b-cc8df62f3108" containerName="pruner" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.518025 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99aba3da-2e2b-43d2-ae9b-cc8df62f3108" containerName="pruner" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.520499 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.527074 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "99aba3da-2e2b-43d2-ae9b-cc8df62f3108" (UID: "99aba3da-2e2b-43d2-ae9b-cc8df62f3108"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.537521 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.599423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.599472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.599548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.599596 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.599609 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/99aba3da-2e2b-43d2-ae9b-cc8df62f3108-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.701056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.701127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.701155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.701262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.701316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.733548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access\") pod \"installer-9-crc\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.880112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.943962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"99aba3da-2e2b-43d2-ae9b-cc8df62f3108","Type":"ContainerDied","Data":"da02565ecebb50fce51b742336d367b7703e4f90eb16dabd2c368117e1579af5"} Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.944030 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da02565ecebb50fce51b742336d367b7703e4f90eb16dabd2c368117e1579af5" Nov 29 01:14:17 crc kubenswrapper[4749]: I1129 01:14:17.944036 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.396479 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.401873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.519892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities\") pod \"a177ce1b-07ee-4839-8725-539c031f9610\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.521267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkx9b\" (UniqueName: \"kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b\") pod \"a177ce1b-07ee-4839-8725-539c031f9610\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.521349 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content\") pod \"a177ce1b-07ee-4839-8725-539c031f9610\" (UID: \"a177ce1b-07ee-4839-8725-539c031f9610\") " Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.520957 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities" (OuterVolumeSpecName: "utilities") pod "a177ce1b-07ee-4839-8725-539c031f9610" (UID: "a177ce1b-07ee-4839-8725-539c031f9610"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.522263 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.526716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b" (OuterVolumeSpecName: "kube-api-access-pkx9b") pod "a177ce1b-07ee-4839-8725-539c031f9610" (UID: "a177ce1b-07ee-4839-8725-539c031f9610"). InnerVolumeSpecName "kube-api-access-pkx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.544227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a177ce1b-07ee-4839-8725-539c031f9610" (UID: "a177ce1b-07ee-4839-8725-539c031f9610"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.623793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkx9b\" (UniqueName: \"kubernetes.io/projected/a177ce1b-07ee-4839-8725-539c031f9610-kube-api-access-pkx9b\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.623828 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a177ce1b-07ee-4839-8725-539c031f9610-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.896562 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.896613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.944526 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.959008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dwzp" event={"ID":"a177ce1b-07ee-4839-8725-539c031f9610","Type":"ContainerDied","Data":"5afa079442e1237fa5ca4a164127bfbea4d626461d4657156221f47ecdc06217"} Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.959058 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dwzp" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.959127 4749 scope.go:117] "RemoveContainer" containerID="b4b90ebfb5dd08ad8561f59686adb580bb05c8760c3be71aaac90db4223baf2d" Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.962903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ce71cce-5270-419e-a3c1-e153e1372586","Type":"ContainerStarted","Data":"6fd3db2ce2c55b95d009a8a41a51da2b18735119edd3bbb954dcf20f1ca9c844"} Nov 29 01:14:18 crc kubenswrapper[4749]: I1129 01:14:18.985710 4749 scope.go:117] "RemoveContainer" containerID="4d0b627ca71aee74963d0e76c36b2f0a1a84e6126e4f9b8b0209b16536a76096" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.012042 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5rwk9" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.022512 4749 scope.go:117] "RemoveContainer" containerID="1f76515d1a2c2147dd9b0add8b485b80ada61d380065fbe20fe0f08ad769fa36" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.025070 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.038482 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dwzp"] Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.089443 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a177ce1b-07ee-4839-8725-539c031f9610" path="/var/lib/kubelet/pods/a177ce1b-07ee-4839-8725-539c031f9610/volumes" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.369261 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.369706 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.416583 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.975452 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerID="3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6" exitCode=0 Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.975515 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerDied","Data":"3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6"} Nov 29 01:14:19 crc kubenswrapper[4749]: I1129 01:14:19.983913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ce71cce-5270-419e-a3c1-e153e1372586","Type":"ContainerStarted","Data":"012a9fc051a512fe5b20889032f04c285e22e357d630b536727b35b21293c3ea"} Nov 29 01:14:20 crc kubenswrapper[4749]: I1129 01:14:20.056729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:20 crc kubenswrapper[4749]: I1129 01:14:20.079363 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" 
podStartSLOduration=3.079335238 podStartE2EDuration="3.079335238s" podCreationTimestamp="2025-11-29 01:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:14:20.033362511 +0000 UTC m=+203.205512368" watchObservedRunningTime="2025-11-29 01:14:20.079335238 +0000 UTC m=+203.251485095" Nov 29 01:14:20 crc kubenswrapper[4749]: I1129 01:14:20.756990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw8rb" Nov 29 01:14:21 crc kubenswrapper[4749]: I1129 01:14:21.187272 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:14:21 crc kubenswrapper[4749]: I1129 01:14:21.993040 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.020674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerStarted","Data":"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77"} Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.022716 4749 generic.go:334] "Generic (PLEG): container finished" podID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerID="a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a" exitCode=0 Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.022955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerDied","Data":"a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a"} Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.023007 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mh54f" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="registry-server" containerID="cri-o://d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d" gracePeriod=2 Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.044316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bj6rw" podStartSLOduration=2.8151890550000003 podStartE2EDuration="54.044277271s" podCreationTimestamp="2025-11-29 01:13:28 +0000 UTC" firstStartedPulling="2025-11-29 01:13:29.971678545 +0000 UTC m=+153.143828442" lastFinishedPulling="2025-11-29 01:14:21.200766781 +0000 UTC m=+204.372916658" observedRunningTime="2025-11-29 01:14:22.043154069 +0000 UTC m=+205.215304006" watchObservedRunningTime="2025-11-29 01:14:22.044277271 +0000 UTC m=+205.216427168" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.044804 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2rn4" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.376404 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.411622 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.443112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.491285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p562h\" (UniqueName: \"kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h\") pod \"f9c1388f-39e4-470f-a359-46e05c3963b0\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.491409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities\") pod \"f9c1388f-39e4-470f-a359-46e05c3963b0\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.491491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content\") pod \"f9c1388f-39e4-470f-a359-46e05c3963b0\" (UID: \"f9c1388f-39e4-470f-a359-46e05c3963b0\") " Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.492576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities" (OuterVolumeSpecName: "utilities") pod "f9c1388f-39e4-470f-a359-46e05c3963b0" (UID: "f9c1388f-39e4-470f-a359-46e05c3963b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.499629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h" (OuterVolumeSpecName: "kube-api-access-p562h") pod "f9c1388f-39e4-470f-a359-46e05c3963b0" (UID: "f9c1388f-39e4-470f-a359-46e05c3963b0"). InnerVolumeSpecName "kube-api-access-p562h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.544216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9c1388f-39e4-470f-a359-46e05c3963b0" (UID: "f9c1388f-39e4-470f-a359-46e05c3963b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.592917 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.592954 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c1388f-39e4-470f-a359-46e05c3963b0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:22 crc kubenswrapper[4749]: I1129 01:14:22.592967 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p562h\" (UniqueName: \"kubernetes.io/projected/f9c1388f-39e4-470f-a359-46e05c3963b0-kube-api-access-p562h\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.031918 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerID="d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d" exitCode=0 Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.032008 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mh54f" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.031995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerDied","Data":"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d"} Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.034429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mh54f" event={"ID":"f9c1388f-39e4-470f-a359-46e05c3963b0","Type":"ContainerDied","Data":"28b654b4fe3027b0cca02972c9d9726e84520f60b55fa6efb2282fd50a7fecbb"} Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.034481 4749 scope.go:117] "RemoveContainer" containerID="d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.038333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerStarted","Data":"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"} Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.054821 4749 scope.go:117] "RemoveContainer" containerID="ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.065409 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qr8r9" podStartSLOduration=2.446204154 podStartE2EDuration="55.065379152s" podCreationTimestamp="2025-11-29 01:13:28 +0000 UTC" firstStartedPulling="2025-11-29 01:13:29.975047886 +0000 UTC m=+153.147197783" lastFinishedPulling="2025-11-29 01:14:22.594222894 +0000 UTC m=+205.766372781" observedRunningTime="2025-11-29 01:14:23.058642116 +0000 UTC m=+206.230791973" watchObservedRunningTime="2025-11-29 01:14:23.065379152 +0000 UTC m=+206.237529019" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.075216 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.080311 4749 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-mh54f"] Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.086951 4749 scope.go:117] "RemoveContainer" containerID="8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.089825 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" path="/var/lib/kubelet/pods/f9c1388f-39e4-470f-a359-46e05c3963b0/volumes" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.102300 4749 scope.go:117] "RemoveContainer" containerID="d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d" Nov 29 01:14:23 crc kubenswrapper[4749]: E1129 01:14:23.103761 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d\": container with ID starting with d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d not found: ID does not exist" containerID="d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.103801 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d"} err="failed to get container status \"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d\": rpc error: code = NotFound desc = could not find container \"d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d\": container with ID starting with d548c8a2b980367b9d8d74813a38d2981a27dba8301d49a51457f0d244f7d21d not found: ID does not exist" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.103858 4749 scope.go:117] "RemoveContainer" containerID="ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45" Nov 29 01:14:23 crc kubenswrapper[4749]: E1129 01:14:23.104143 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45\": container with ID starting with ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45 not found: ID does not exist" containerID="ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.104181 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45"} err="failed to get container status \"ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45\": rpc error: code = NotFound desc = could not find container \"ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45\": container with ID starting with ce12eb271b225c9ef0c6cac2bfffced2017148495cca235a91e67b60562f5d45 not found: ID does not exist" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.104212 4749 scope.go:117] "RemoveContainer" containerID="8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb" Nov 29 01:14:23 crc kubenswrapper[4749]: E1129 01:14:23.104470 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb\": container with ID starting with 8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb not found: ID does not exist" 
containerID="8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb" Nov 29 01:14:23 crc kubenswrapper[4749]: I1129 01:14:23.104504 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb"} err="failed to get container status \"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb\": rpc error: code = NotFound desc = could not find container \"8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb\": container with ID starting with 8f74b067e81eb93874b611bb571e23a49c9de0ee42abb1dbaa8fa1c383f615eb not found: ID does not exist" Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.374559 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.374684 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.374783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.375965 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.376097 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a" gracePeriod=600 Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.787746 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:14:25 crc kubenswrapper[4749]: I1129 01:14:25.788775 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2tb8v" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="registry-server" containerID="cri-o://04a66febe7afa72a4ef08bcfc439bb2adc23807b554336955e4156a65961b9b0" gracePeriod=2 Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.096270 4749 generic.go:334] "Generic (PLEG): container finished" podID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerID="04a66febe7afa72a4ef08bcfc439bb2adc23807b554336955e4156a65961b9b0" exitCode=0 Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.096437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" 
event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerDied","Data":"04a66febe7afa72a4ef08bcfc439bb2adc23807b554336955e4156a65961b9b0"} Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.101518 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a" exitCode=0 Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.101609 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a"} Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.178156 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.257582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities\") pod \"305e4c45-d936-45c8-ac76-1b85aa52eb08\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.257655 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc8rm\" (UniqueName: \"kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm\") pod \"305e4c45-d936-45c8-ac76-1b85aa52eb08\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.257690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content\") pod \"305e4c45-d936-45c8-ac76-1b85aa52eb08\" (UID: \"305e4c45-d936-45c8-ac76-1b85aa52eb08\") " Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.259021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities" (OuterVolumeSpecName: "utilities") pod "305e4c45-d936-45c8-ac76-1b85aa52eb08" (UID: "305e4c45-d936-45c8-ac76-1b85aa52eb08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.266240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm" (OuterVolumeSpecName: "kube-api-access-mc8rm") pod "305e4c45-d936-45c8-ac76-1b85aa52eb08" (UID: "305e4c45-d936-45c8-ac76-1b85aa52eb08"). InnerVolumeSpecName "kube-api-access-mc8rm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.358849 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.358927 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc8rm\" (UniqueName: \"kubernetes.io/projected/305e4c45-d936-45c8-ac76-1b85aa52eb08-kube-api-access-mc8rm\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.397659 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "305e4c45-d936-45c8-ac76-1b85aa52eb08" (UID: "305e4c45-d936-45c8-ac76-1b85aa52eb08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:26 crc kubenswrapper[4749]: I1129 01:14:26.460861 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/305e4c45-d936-45c8-ac76-1b85aa52eb08-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.114868 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2tb8v" Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.114965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tb8v" event={"ID":"305e4c45-d936-45c8-ac76-1b85aa52eb08","Type":"ContainerDied","Data":"376c5be0d182a3de4aa2d0537342d2a15c043427e0948228674308c10f40b36e"} Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.115894 4749 scope.go:117] "RemoveContainer" containerID="04a66febe7afa72a4ef08bcfc439bb2adc23807b554336955e4156a65961b9b0" Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.119320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26"} Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.149465 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.155397 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2tb8v"] Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.156665 4749 scope.go:117] "RemoveContainer" containerID="7d93c5bcc8810d5ad8222fbfddf0366fb3a65c3469f66f49303c951b177d882e" Nov 29 01:14:27 crc kubenswrapper[4749]: I1129 01:14:27.191049 4749 scope.go:117] "RemoveContainer" containerID="e3045091df96f05c99a1eb41c14e7003b425fce365a61bf14cb04a3fc057e005" Nov 29 01:14:28 crc kubenswrapper[4749]: I1129 01:14:28.748577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:14:28 crc kubenswrapper[4749]: I1129 01:14:28.749123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:14:28 crc kubenswrapper[4749]: I1129 01:14:28.806276 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.085028 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" path="/var/lib/kubelet/pods/305e4c45-d936-45c8-ac76-1b85aa52eb08/volumes" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.120894 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.121345 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.196007 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.224221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qr8r9" Nov 29 01:14:29 crc kubenswrapper[4749]: I1129 01:14:29.472231 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:14:30 crc kubenswrapper[4749]: I1129 01:14:30.215496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:31 crc kubenswrapper[4749]: I1129 01:14:31.589708 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:14:32 crc kubenswrapper[4749]: I1129 01:14:32.161107 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bj6rw" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="registry-server" containerID="cri-o://91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77" gracePeriod=2 Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.034329 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.172852 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerID="91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77" exitCode=0 Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.172924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerDied","Data":"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77"} Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.172967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bj6rw" event={"ID":"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5","Type":"ContainerDied","Data":"5d0e8e995deb4fbed6060bbde014ea8392da25cc0dfdb030bb262d6f8dd80e68"} Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.172990 4749 scope.go:117] "RemoveContainer" containerID="91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.173000 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bj6rw" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.199940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities\") pod \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.200000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content\") pod \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.200068 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v9zs\" (UniqueName: \"kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs\") pod \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\" (UID: \"7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5\") " Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.202560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities" (OuterVolumeSpecName: "utilities") pod "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" (UID: "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.204873 4749 scope.go:117] "RemoveContainer" containerID="3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.212456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs" (OuterVolumeSpecName: "kube-api-access-9v9zs") pod "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" (UID: "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5"). InnerVolumeSpecName "kube-api-access-9v9zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.253426 4749 scope.go:117] "RemoveContainer" containerID="0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.272579 4749 scope.go:117] "RemoveContainer" containerID="91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77" Nov 29 01:14:33 crc kubenswrapper[4749]: E1129 01:14:33.273081 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77\": container with ID starting with 91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77 not found: ID does not exist" containerID="91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.273127 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77"} err="failed to get container status \"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77\": rpc error: code = NotFound desc = could not find container \"91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77\": container with ID starting with 91ffb450a4931209e4b0ecbce44489cdd4e2d72661862a35b48eb2ad4b8b2d77 not found: ID does not exist" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.273161 4749 scope.go:117] "RemoveContainer" containerID="3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6" Nov 29 01:14:33 crc kubenswrapper[4749]: E1129 01:14:33.273506 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6\": container with ID starting with 3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6 not found: ID does not exist" containerID="3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.273544 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6"} err="failed to get container status \"3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6\": rpc error: code = NotFound desc = could not find container \"3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6\": container with ID starting with 3f5f3bdbba5cc0a3cc4157c32ffcf4c66e4f73c553a618a90568685d23f1d7a6 not found: ID does not exist" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.273566 4749 scope.go:117] "RemoveContainer" containerID="0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869" Nov 29 01:14:33 crc kubenswrapper[4749]: E1129 01:14:33.273787 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869\": container with ID starting with 0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869 not found: ID does not exist" containerID="0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.273814 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869"} err="failed to get container status \"0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869\": rpc error: code = NotFound desc = could not find container \"0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869\": container with ID starting with 0209d2faaa69e78497bbd910213e5c66bd9461a89c805021981e38554919e869 not found: ID does not exist" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.290334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" (UID: "7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.302004 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.302041 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.302057 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v9zs\" (UniqueName: \"kubernetes.io/projected/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5-kube-api-access-9v9zs\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.531717 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:14:33 crc kubenswrapper[4749]: I1129 01:14:33.536896 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bj6rw"] Nov 29 01:14:35 crc kubenswrapper[4749]: I1129 01:14:35.090820 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" path="/var/lib/kubelet/pods/7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5/volumes" Nov 29 01:14:54 crc kubenswrapper[4749]: I1129 01:14:54.509927 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerName="oauth-openshift" containerID="cri-o://9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3" gracePeriod=15 Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.017110 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.060046 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-644868f-rs9tk"] Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.060845 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.060966 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061052 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061143 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061239 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerName="oauth-openshift" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061328 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerName="oauth-openshift" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061406 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061483 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061559 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061648 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061806 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.061883 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.061955 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062031 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062214 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062291 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062363 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062519 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062599 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062685 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062756 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="extract-content" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.062830 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.062899 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="extract-utilities" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.063100 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerName="oauth-openshift" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.063219 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="305e4c45-d936-45c8-ac76-1b85aa52eb08" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.063341 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c1388f-39e4-470f-a359-46e05c3963b0" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.063416 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a177ce1b-07ee-4839-8725-539c031f9610" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.063487 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dea467d-7ff5-488f-b8cd-0a7bb63b9ea5" containerName="registry-server" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.064120 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.074267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.074562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.074676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.074837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.074939 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075034 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075413 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 
01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6x8w\" (UniqueName: \"kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075737 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075764 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.075803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle\") pod \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\" (UID: \"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917\") " Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.076372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.076674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.076756 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.076890 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.083741 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.084928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.085936 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.088677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.090105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.093518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.103070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.103705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.104072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w" (OuterVolumeSpecName: "kube-api-access-c6x8w") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "kube-api-access-c6x8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.104256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.122146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" (UID: "6f67a3e1-0ab3-4e01-a68d-732b1f9e6917"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.129513 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-644868f-rs9tk"] Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-policies\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-session\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179702 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179770 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-dir\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179887 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.179930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsh6\" (UniqueName: \"kubernetes.io/projected/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-kube-api-access-qdsh6\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180094 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180127 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180150 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180172 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180194 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180270 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6x8w\" (UniqueName: \"kubernetes.io/projected/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-kube-api-access-c6x8w\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180292 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180312 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180333 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.180359 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.181421 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.181461 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.282998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: 
I1129 01:14:55.283073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-dir\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsh6\" (UniqueName: \"kubernetes.io/projected/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-kube-api-access-qdsh6\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-policies\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283286 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-session\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.283371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.284252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-dir\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.286044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-audit-policies\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.286106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.286900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.287783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.289443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.290031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-login\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.290165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.290698 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.292395 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-template-error\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.293058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-session\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.295224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.296150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.312919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsh6\" (UniqueName: \"kubernetes.io/projected/2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf-kube-api-access-qdsh6\") pod \"oauth-openshift-644868f-rs9tk\" (UID: \"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf\") " pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.355983 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" containerID="9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3" exitCode=0 Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.356057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" event={"ID":"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917","Type":"ContainerDied","Data":"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3"} Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.356091 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.356113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wm9jp" event={"ID":"6f67a3e1-0ab3-4e01-a68d-732b1f9e6917","Type":"ContainerDied","Data":"1c173ef22cd2a1d3a31bd3c2293672c8b2fd997d8673575e9245d2e46de4434d"} Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.356148 4749 scope.go:117] "RemoveContainer" containerID="9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.384330 4749 scope.go:117] "RemoveContainer" containerID="9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3" Nov 29 01:14:55 crc kubenswrapper[4749]: E1129 01:14:55.385003 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3\": container with ID starting with 9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3 not found: ID does not exist" containerID="9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.385084 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3"} err="failed to get container status \"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3\": rpc error: code = NotFound desc = could not find container \"9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3\": container with ID starting with 9f744e4eb283ad05e9ff1fd58265cdf9e1ccec927183432737af68216cfe7cf3 not found: ID does not exist" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.406676 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.408107 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wm9jp"] Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.429817 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:55 crc kubenswrapper[4749]: I1129 01:14:55.715306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-644868f-rs9tk"] Nov 29 01:14:56 crc kubenswrapper[4749]: I1129 01:14:56.369809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" event={"ID":"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf","Type":"ContainerStarted","Data":"cf46e963fb7ec94d7e121ac2a8596bcc848727262e086ac321822e2468ba1a95"} Nov 29 01:14:56 crc kubenswrapper[4749]: I1129 01:14:56.371258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:56 crc kubenswrapper[4749]: I1129 01:14:56.371509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" event={"ID":"2a82b7d6-7ce8-4c4c-9c94-4c0bd74b3bdf","Type":"ContainerStarted","Data":"a3e6ff31878e59ae0703a4975b4a60c2404d42a26c06c1c23b994f88033419b5"} Nov 29 01:14:56 crc kubenswrapper[4749]: I1129 01:14:56.381334 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" Nov 29 01:14:56 crc kubenswrapper[4749]: I1129 01:14:56.412877 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-644868f-rs9tk" podStartSLOduration=27.412837574 podStartE2EDuration="27.412837574s" podCreationTimestamp="2025-11-29 01:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:14:56.410568868 +0000 UTC m=+239.582718765" watchObservedRunningTime="2025-11-29 01:14:56.412837574 +0000 UTC m=+239.584987471" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.093889 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f67a3e1-0ab3-4e01-a68d-732b1f9e6917" path="/var/lib/kubelet/pods/6f67a3e1-0ab3-4e01-a68d-732b1f9e6917/volumes" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.225746 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.226699 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.226955 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.227215 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457" gracePeriod=15 Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.227342 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275" gracePeriod=15 Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.227391 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6" gracePeriod=15 Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.227337 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870" gracePeriod=15 Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.227450 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103" gracePeriod=15 Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.228825 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229280 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229310 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229325 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229335 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229348 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229356 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229370 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229378 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229390 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229412 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229420 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.229434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229593 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229614 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229627 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229640 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229656 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.229664 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.301581 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.320858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.422948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423209 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423449 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.423479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.424415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.424556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.425797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
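Each volume above moves through the same three reconciler phases: VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded. For host-path volumes there is no attach or format step, so SetUp amounts to little more than validating the directory on the host. A toy sketch of that per-volume loop, assuming hypothetical names (hostPathVolume, setUpHostPath); it is not kubelet source:

```go
package main

import (
	"fmt"
	"os"
)

// hostPathVolume mirrors the fields visible in the log records above.
type hostPathVolume struct {
	uniqueName string // e.g. "kubernetes.io/host-path/<podUID>-<name>"
	hostPath   string
}

// setUpHostPath is what "SetUp" reduces to for a host-path volume in
// this sketch: confirm the path already exists on the node.
func setUpHostPath(v hostPathVolume) error {
	if _, err := os.Stat(v.hostPath); err != nil {
		return fmt.Errorf("SetUp failed for %q: %w", v.uniqueName, err)
	}
	return nil
}

func main() {
	volumes := []hostPathVolume{ // illustrative entries only
		{"kubernetes.io/host-path/demo-var-log", "/var/log"},
		{"kubernetes.io/host-path/demo-cert-dir", "/tmp"},
	}
	for _, v := range volumes {
		fmt.Printf("MountVolume started for volume %q\n", v.uniqueName)
		if err := setUpHostPath(v); err != nil {
			fmt.Println(err)
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.uniqueName)
	}
}
```

Nov 29 01:14:57 crc kubenswrapper[4749]: I1129 01:14:57.593795 4749 util.go:30] "No sandbox for pod can be found.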
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:14:57 crc kubenswrapper[4749]: W1129 01:14:57.625149 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-820a131efea36f7c389f4e9150f5e66e5d4e853963efb54eea6be604c54a0d16 WatchSource:0}: Error finding container 820a131efea36f7c389f4e9150f5e66e5d4e853963efb54eea6be604c54a0d16: Status 404 returned error can't find the container with id 820a131efea36f7c389f4e9150f5e66e5d4e853963efb54eea6be604c54a0d16 Nov 29 01:14:57 crc kubenswrapper[4749]: E1129 01:14:57.629565 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c55426251de90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 01:14:57.628126864 +0000 UTC m=+240.800276721,LastTimestamp:2025-11-29 01:14:57.628126864 +0000 UTC m=+240.800276721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.402347 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.404179 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.405181 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275" exitCode=0 Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.405232 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870" exitCode=0 Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.405248 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6" exitCode=0 Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.405261 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103" exitCode=2 Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.405336 4749 scope.go:117] "RemoveContainer" containerID="fdc577111673604363c9338cc6f0d4ce2798904328a2b7eb34a6061a3d721521" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 
01:14:58.408240 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ce71cce-5270-419e-a3c1-e153e1372586" containerID="012a9fc051a512fe5b20889032f04c285e22e357d630b536727b35b21293c3ea" exitCode=0 Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.408361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ce71cce-5270-419e-a3c1-e153e1372586","Type":"ContainerDied","Data":"012a9fc051a512fe5b20889032f04c285e22e357d630b536727b35b21293c3ea"} Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.411300 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.411874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d82b38f225395fe91022d3efd8aeb81b8c159648efe23122a2ff7e7483ec7d2f"} Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.411943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"820a131efea36f7c389f4e9150f5e66e5d4e853963efb54eea6be604c54a0d16"} Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.413411 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.413949 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.414601 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.415016 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:58 crc kubenswrapper[4749]: I1129 01:14:58.415473 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection 
refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.440695 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 01:14:59 crc kubenswrapper[4749]: E1129 01:14:59.729140 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457.scope\": RecentStats: unable to find data in memory cache]" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.767389 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.768992 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.770181 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.771130 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.771981 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.772758 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.773114 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.773340 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.773534 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock\") pod \"6ce71cce-5270-419e-a3c1-e153e1372586\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock" (OuterVolumeSpecName: "var-lock") pod "6ce71cce-5270-419e-a3c1-e153e1372586" (UID: "6ce71cce-5270-419e-a3c1-e153e1372586"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859759 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access\") pod \"6ce71cce-5270-419e-a3c1-e153e1372586\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir\") pod \"6ce71cce-5270-419e-a3c1-e153e1372586\" (UID: \"6ce71cce-5270-419e-a3c1-e153e1372586\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.859900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860029 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6ce71cce-5270-419e-a3c1-e153e1372586" (UID: "6ce71cce-5270-419e-a3c1-e153e1372586"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860253 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860698 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860933 4749 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860952 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860987 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6ce71cce-5270-419e-a3c1-e153e1372586-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.860999 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.869585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6ce71cce-5270-419e-a3c1-e153e1372586" (UID: "6ce71cce-5270-419e-a3c1-e153e1372586"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:14:59 crc kubenswrapper[4749]: I1129 01:14:59.963316 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6ce71cce-5270-419e-a3c1-e153e1372586-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.458753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.460163 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457" exitCode=0 Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.460344 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.460415 4749 scope.go:117] "RemoveContainer" containerID="e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.463988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6ce71cce-5270-419e-a3c1-e153e1372586","Type":"ContainerDied","Data":"6fd3db2ce2c55b95d009a8a41a51da2b18735119edd3bbb954dcf20f1ca9c844"} Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.464044 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd3db2ce2c55b95d009a8a41a51da2b18735119edd3bbb954dcf20f1ca9c844" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.464039 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.486427 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.487233 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.488824 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.497929 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.499264 4749 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.499858 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.501795 4749 scope.go:117] "RemoveContainer" containerID="d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.522448 4749 scope.go:117] "RemoveContainer" containerID="8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.561489 4749 scope.go:117] "RemoveContainer" containerID="69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.585042 4749 scope.go:117] "RemoveContainer" containerID="0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.603467 4749 scope.go:117] "RemoveContainer" containerID="65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.633528 4749 scope.go:117] "RemoveContainer" containerID="e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275" Nov 29 01:15:00 crc 
kubenswrapper[4749]: E1129 01:15:00.634737 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\": container with ID starting with e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275 not found: ID does not exist" containerID="e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.634793 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275"} err="failed to get container status \"e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\": rpc error: code = NotFound desc = could not find container \"e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275\": container with ID starting with e07d453edc6b5bbe9c2ca318bd9361dff8081958727ed4700bf5bb76a76c9275 not found: ID does not exist" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.634836 4749 scope.go:117] "RemoveContainer" containerID="d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870" Nov 29 01:15:00 crc kubenswrapper[4749]: E1129 01:15:00.635496 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\": container with ID starting with d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870 not found: ID does not exist" containerID="d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.635552 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870"} err="failed to get container status \"d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\": rpc error: code = NotFound desc = could not find container \"d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870\": container with ID starting with d2ca2e24225d5083c39c5e8c9d58de5346a2df0343a2f15b491cf056fb972870 not found: ID does not exist" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.635590 4749 scope.go:117] "RemoveContainer" containerID="8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6" Nov 29 01:15:00 crc kubenswrapper[4749]: E1129 01:15:00.636641 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\": container with ID starting with 8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6 not found: ID does not exist" containerID="8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6" Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.636678 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6"} err="failed to get container status \"8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\": rpc error: code = NotFound desc = could not find container \"8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6\": container with ID starting with 8fecf545b095c5eeb6a143a86b4fe8427aeefd75d34a7961688ade6415c4e9e6 not found: ID does not exist" Nov 29 01:15:00 crc kubenswrapper[4749]: 
I1129 01:15:00.636701 4749 scope.go:117] "RemoveContainer" containerID="69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103"
Nov 29 01:15:00 crc kubenswrapper[4749]: E1129 01:15:00.636948 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\": container with ID starting with 69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103 not found: ID does not exist" containerID="69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103"
Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.636976 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103"} err="failed to get container status \"69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\": rpc error: code = NotFound desc = could not find container \"69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103\": container with ID starting with 69f8bd7455e42f1d1b93da2264f6c2ff46662d94d13959afd798513240f7f103 not found: ID does not exist"
Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.636991 4749 scope.go:117] "RemoveContainer" containerID="0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457"
Nov 29 01:15:00 crc kubenswrapper[4749]: E1129 01:15:00.637538 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\": container with ID starting with 0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457 not found: ID does not exist" containerID="0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457"
Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.637577 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457"} err="failed to get container status \"0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\": rpc error: code = NotFound desc = could not find container \"0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457\": container with ID starting with 0d825cb1f890014d2d050e4783c248f20afebf7d4ad07a3f27410db81f490457 not found: ID does not exist"
Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.637601 4749 scope.go:117] "RemoveContainer" containerID="65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e"
Nov 29 01:15:00 crc kubenswrapper[4749]: E1129 01:15:00.637803 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\": container with ID starting with 65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e not found: ID does not exist" containerID="65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e"
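Every "RemoveContainer" above draws a NotFound from the runtime, which the kubelet logs and then moves past: the old pod's containers are already gone, and removal is idempotent, so "already absent" counts as success. A minimal sketch of that pattern; errNotFound, removeFromRuntime, and removeIdempotent are illustrative stand-ins, not kubelet or CRI-O source:

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for "rpc error: code = NotFound" from the CRI runtime.
var errNotFound = errors.New("NotFound: ID does not exist")

// removeFromRuntime pretends the runtime has already deleted this container.
func removeFromRuntime(id string) error { return errNotFound }

// removeIdempotent treats NotFound as success: the desired state
// ("container absent") already holds, so there is nothing left to do.
func removeIdempotent(id string) error {
	if err := removeFromRuntime(id); err != nil && !errors.Is(err, errNotFound) {
		return err // a real failure; NotFound is merely logged upstream
	}
	fmt.Printf("container %s removed (or already gone)\n", id)
	return nil
}

func main() {
	// Truncated ID, for illustration only.
	if err := removeIdempotent("69f8bd7455e4"); err != nil {
		fmt.Println("remove failed:", err)
	}
}
```

Nov 29 01:15:00 crc kubenswrapper[4749]: I1129 01:15:00.637828 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e"} err="failed to get container status \"65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\": rpc error: code = NotFound desc = could not find container \"65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e\": container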
with ID starting with 65b3a18984542c4900db4ed41b860f1799b9a242f3bd07114eb64fd81d9c919e not found: ID does not exist" Nov 29 01:15:01 crc kubenswrapper[4749]: I1129 01:15:01.082975 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 29 01:15:01 crc kubenswrapper[4749]: E1129 01:15:01.544152 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187c55426251de90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-29 01:14:57.628126864 +0000 UTC m=+240.800276721,LastTimestamp:2025-11-29 01:14:57.628126864 +0000 UTC m=+240.800276721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.888967 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.889527 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.890343 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.890727 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.891320 4749 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:02 crc kubenswrapper[4749]: I1129 01:15:02.891370 4749 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 29 01:15:02 crc kubenswrapper[4749]: E1129 01:15:02.891681 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" 
interval="200ms"
Nov 29 01:15:03 crc kubenswrapper[4749]: E1129 01:15:03.092309 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms"
Nov 29 01:15:03 crc kubenswrapper[4749]: E1129 01:15:03.493664 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms"
Nov 29 01:15:04 crc kubenswrapper[4749]: E1129 01:15:04.295286 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s"
Nov 29 01:15:05 crc kubenswrapper[4749]: E1129 01:15:05.897346 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s"
Nov 29 01:15:06 crc kubenswrapper[4749]: E1129 01:15:06.162098 4749 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" volumeName="registry-storage"
Nov 29 01:15:07 crc kubenswrapper[4749]: I1129 01:15:07.082302 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
Nov 29 01:15:07 crc kubenswrapper[4749]: I1129 01:15:07.083800 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused"
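The "Failed to ensure lease exists, will retry" records show the node-lease controller doubling its retry interval on each consecutive failure while the API server endpoint refuses connections: 200ms, 400ms, 800ms, 1.6s, 3.2s above, reaching 6.4s below. A minimal sketch of that doubling backoff; retryWithBackoff is a hypothetical helper, and the 7s cap is an assumption for illustration, not taken from the log:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff retries op, doubling the wait after every failure
// up to a fixed cap, mirroring the interval sequence in the log.
func retryWithBackoff(op func() error, initial, max time.Duration) {
	interval := initial
	for {
		if err := op(); err == nil {
			return
		}
		fmt.Printf("failed, will retry, interval=%s\n", interval)
		time.Sleep(interval)
		if interval *= 2; interval > max {
			interval = max // clamp so the node keeps probing regularly
		}
	}
}

func main() {
	attempts := 0
	retryWithBackoff(func() error {
		if attempts++; attempts < 6 {
			return errors.New("connect: connection refused")
		}
		return nil // the API server came back
	}, 200*time.Millisecond, 7*time.Second)
}
```

Capping the interval keeps the kubelet probing often enough to catch the restarted kube-apiserver quickly once it starts answering again, as it does by 01:15:17 below.

Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.075116 4749 util.go:30] "No sandbox for pod can be found.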
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.076646 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.077067 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:09 crc kubenswrapper[4749]: E1129 01:15:09.098014 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="6.4s" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.107097 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.107177 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:09 crc kubenswrapper[4749]: E1129 01:15:09.107765 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.108610 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:09 crc kubenswrapper[4749]: W1129 01:15:09.131558 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-448a5898080e8bdf9e49e2ee031ea6a0558d4eceb916dc2f7a8715ae947065b7 WatchSource:0}: Error finding container 448a5898080e8bdf9e49e2ee031ea6a0558d4eceb916dc2f7a8715ae947065b7: Status 404 returned error can't find the container with id 448a5898080e8bdf9e49e2ee031ea6a0558d4eceb916dc2f7a8715ae947065b7 Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.564985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d4ce39c08a02b3f49e00c8fa92f77f486f12b3078b6d9ec20ef40fc01fc0dff"} Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.565564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"448a5898080e8bdf9e49e2ee031ea6a0558d4eceb916dc2f7a8715ae947065b7"} Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.566017 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.566052 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:09 crc kubenswrapper[4749]: E1129 01:15:09.566850 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.567884 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:09 crc kubenswrapper[4749]: I1129 01:15:09.568448 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.576724 4749 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="9d4ce39c08a02b3f49e00c8fa92f77f486f12b3078b6d9ec20ef40fc01fc0dff" exitCode=0 Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.576818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"9d4ce39c08a02b3f49e00c8fa92f77f486f12b3078b6d9ec20ef40fc01fc0dff"} Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.577371 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.577397 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.578102 4749 status_manager.go:851] "Failed to get status for pod" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:10 crc kubenswrapper[4749]: E1129 01:15:10.578131 4749 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:10 crc kubenswrapper[4749]: I1129 01:15:10.578765 4749 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.600800 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.602906 4749 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655" exitCode=1 Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.603189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655"} Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.604345 4749 scope.go:117] "RemoveContainer" containerID="39b031ab60bee7915ebc21b5485dbf8b9fb89e820bd829b280346b40a435e655" Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.609649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0dd95304c44292ba93217cab92ba4329a33a9d780af19ce39238d5bb2ac7a170"} Nov 29 01:15:11 crc kubenswrapper[4749]: I1129 01:15:11.609696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9e11f1d508e5ba8b20105a4b4ab553900559a5cda7beae48a3b862e0a314babb"} Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.578539 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.630343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"da4a3b3decd7479da1962718c51caf84280e367f7a5200ed6ef3940bdc29ce3e"} Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.630390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8e25d8dbf7ad42b854b404c7f832b49000b11534c7a6d9811fcf5489ee778f36"} Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.630407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2b045729965e177c112a11676439c2c61512eb3000dfb4a556da9e87a0874c34"} Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.630693 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.630712 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.631150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.634477 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 29 01:15:12 crc kubenswrapper[4749]: I1129 01:15:12.634535 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7ea5ad31aaa829cee616a7a5c88b073002c1e177fb154fb3c6f09828208f695c"} Nov 29 01:15:14 crc kubenswrapper[4749]: I1129 01:15:14.108764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:14 crc kubenswrapper[4749]: I1129 01:15:14.109452 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:14 crc kubenswrapper[4749]: I1129 01:15:14.113836 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:15:14 crc kubenswrapper[4749]: I1129 01:15:14.116029 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:17 crc kubenswrapper[4749]: I1129 01:15:17.716952 4749 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:17 crc kubenswrapper[4749]: I1129 01:15:17.922671 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9afddb2-e7f9-43ac-ab06-d20403498558" Nov 29 01:15:18 crc kubenswrapper[4749]: I1129 01:15:18.681036 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:18 crc kubenswrapper[4749]: I1129 01:15:18.681089 4749 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:18 crc kubenswrapper[4749]: I1129 01:15:18.685323 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9afddb2-e7f9-43ac-ab06-d20403498558" Nov 29 01:15:18 crc kubenswrapper[4749]: I1129 01:15:18.688353 4749 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://9e11f1d508e5ba8b20105a4b4ab553900559a5cda7beae48a3b862e0a314babb" Nov 29 01:15:18 crc kubenswrapper[4749]: I1129 01:15:18.688564 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 29 01:15:19 crc kubenswrapper[4749]: I1129 01:15:19.688669 4749 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:19 crc kubenswrapper[4749]: I1129 01:15:19.689444 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="699d46ba-63d2-4366-b76f-344c5ef7bcdb" Nov 29 01:15:19 crc kubenswrapper[4749]: I1129 01:15:19.693317 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f9afddb2-e7f9-43ac-ab06-d20403498558" Nov 29 01:15:22 crc kubenswrapper[4749]: I1129 01:15:22.578497 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:15:22 crc kubenswrapper[4749]: I1129 01:15:22.585696 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:15:22 crc kubenswrapper[4749]: I1129 01:15:22.719251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 29 01:15:27 crc kubenswrapper[4749]: I1129 01:15:27.412358 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 29 01:15:27 crc kubenswrapper[4749]: I1129 01:15:27.487630 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 29 01:15:28 crc kubenswrapper[4749]: I1129 01:15:28.714086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 29 01:15:28 crc kubenswrapper[4749]: I1129 01:15:28.740362 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 29 01:15:28 crc kubenswrapper[4749]: I1129 01:15:28.784759 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 29 01:15:29 crc kubenswrapper[4749]: I1129 01:15:29.054709 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 29 01:15:29 crc kubenswrapper[4749]: I1129 01:15:29.151498 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 29 01:15:29 crc 
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.321405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.383852 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.460087 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.526816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.574756 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.684024 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.689270 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.722965 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.753768 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.756563 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.766736 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.949917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Nov 29 01:15:30 crc kubenswrapper[4749]: I1129 01:15:30.954875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.037845 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.158425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.175534 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.257926 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.324668 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.414606 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.458665 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.662376 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.670699 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.693690 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.776668 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.795746 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.810909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.912932 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.927285 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.943731 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 29 01:15:31 crc kubenswrapper[4749]: I1129 01:15:31.980359 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.010292 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.122522 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.246887 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.256163 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.278290 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.288381 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.357254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.377827 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.390907 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.533549 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.601041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.686517 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.744276 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.768126 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.841371 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.988763 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Nov 29 01:15:32 crc kubenswrapper[4749]: I1129 01:15:32.992068 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.009183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.095188 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.397399 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.430910 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.560727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.567405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.571868 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.753316 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.762387 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.901062 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Nov 29 01:15:33 crc kubenswrapper[4749]: I1129 01:15:33.997074 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.008653 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.019502 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.069554 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.077567 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.152890 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.167718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.168760 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.216342 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.489724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.594666 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.616511 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.714132 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.835434 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.850081 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.880511 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.987426 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Nov 29 01:15:34 crc kubenswrapper[4749]: I1129 01:15:34.990039 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.205074 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.242593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.266727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.285515 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.354837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.373817 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.428645 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.492828 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.540846 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.574379 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.584061 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.636993 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.641435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.655515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.687012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.765964 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.774335 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.781648 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.817045 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.819916 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.825064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.876723 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.904607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.909417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 29 01:15:35 crc kubenswrapper[4749]: I1129 01:15:35.935344 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.199025 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.224140 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.362693 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.465671 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.828446 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.874136 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.888360 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.909525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 29 01:15:36 crc kubenswrapper[4749]: I1129 01:15:36.997106 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.145618 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.186738 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.188774 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.217353 4749 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.236081 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.351293 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.390641 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.393150 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.405327 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.526310 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.530956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.618730 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.619861 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.664742 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.692609 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.749956 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.843118 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.880630 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.914893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 29 01:15:37 crc kubenswrapper[4749]: I1129 01:15:37.993098 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.076975 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.135069 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" 
Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.152879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.157680 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.193878 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.257780 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.385082 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.471091 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.477481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.543675 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.575924 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.618562 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.726979 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.779341 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.814855 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.839156 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.843578 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.850808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 29 01:15:38 crc kubenswrapper[4749]: I1129 01:15:38.908660 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.176385 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.213479 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 29 01:15:39 
crc kubenswrapper[4749]: I1129 01:15:39.260412 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.494035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.515009 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.521444 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.521411607 podStartE2EDuration="42.521411607s" podCreationTimestamp="2025-11-29 01:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:15:17.742822499 +0000 UTC m=+260.914972386" watchObservedRunningTime="2025-11-29 01:15:39.521411607 +0000 UTC m=+282.693561474" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.522518 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.522584 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"] Nov 29 01:15:39 crc kubenswrapper[4749]: E1129 01:15:39.522880 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" containerName="installer" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.522907 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" containerName="installer" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.523049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce71cce-5270-419e-a3c1-e153e1372586" containerName="installer" Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.523809 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.529916 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.529961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.532874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.544362 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.544808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.555689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.556057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.556291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m88d\" (UniqueName: \"kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.567354 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.5673262 podStartE2EDuration="22.5673262s" podCreationTimestamp="2025-11-29 01:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:15:39.562376171 +0000 UTC m=+282.734526068" watchObservedRunningTime="2025-11-29 01:15:39.5673262 +0000 UTC m=+282.739476087"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.645152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.658613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.658693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.658735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m88d\" (UniqueName: \"kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.660416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.675912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.689013 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.694071 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.696898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m88d\" (UniqueName: \"kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d\") pod \"collect-profiles-29406315-dh88z\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.703828 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.752831 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.787800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.791879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.796976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.852633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.906260 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.913219 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.940667 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.965800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Nov 29 01:15:39 crc kubenswrapper[4749]: I1129 01:15:39.969969 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.044077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.060161 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.091908 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.147764 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"]
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.178286 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.225372 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.250229 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.273091 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.316576 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.329863 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.331941 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.371301 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.402783 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.413877 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.420938 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.429457 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.455360 4749 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.455859 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d82b38f225395fe91022d3efd8aeb81b8c159648efe23122a2ff7e7483ec7d2f" gracePeriod=5
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.459488 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.593823 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.594036 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.624231 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.632781 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.713138 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.743439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.776055 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.897780 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.927405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.968304 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Nov 29 01:15:40 crc kubenswrapper[4749]: I1129 01:15:40.972815 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.008326 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.122026 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.227679 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.340333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.575525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z" event={"ID":"5e79e70c-cf41-46f1-9df6-f13b5ff21f63","Type":"ContainerStarted","Data":"f1fbea0bdc5f5db766193201f4381eab3ddca62809b355c227cb9566d1a14e06"}
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.575902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z" event={"ID":"5e79e70c-cf41-46f1-9df6-f13b5ff21f63","Type":"ContainerStarted","Data":"a30e8dd5a8c482cd34871d0bf05ff7442bf78d2b3566ab3baaec7a0594ff2a2d"}
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.594831 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.597620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.644064 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.793624 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.818636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.878691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.896811 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.915413 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Nov 29 01:15:41 crc kubenswrapper[4749]: I1129 01:15:41.995040 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.001688 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.002166 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.005086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.021161 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.100034 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.187258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.206396 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.452358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.505911 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.552336 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.593686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z" event={"ID":"5e79e70c-cf41-46f1-9df6-f13b5ff21f63","Type":"ContainerDied","Data":"f1fbea0bdc5f5db766193201f4381eab3ddca62809b355c227cb9566d1a14e06"}
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.593486 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e79e70c-cf41-46f1-9df6-f13b5ff21f63" containerID="f1fbea0bdc5f5db766193201f4381eab3ddca62809b355c227cb9566d1a14e06" exitCode=0
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.603961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.606023 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.735146 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.753986 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.776466 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.874078 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 29 01:15:42 crc kubenswrapper[4749]: I1129 01:15:42.901396 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.157087 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.199516 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.374046 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.472237 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.493099 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.597145 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.847191 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.870610 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.891676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.937531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume\") pod \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") "
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.937693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m88d\" (UniqueName: \"kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d\") pod \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") "
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.937791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume\") pod \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\" (UID: \"5e79e70c-cf41-46f1-9df6-f13b5ff21f63\") "
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.938692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e79e70c-cf41-46f1-9df6-f13b5ff21f63" (UID: "5e79e70c-cf41-46f1-9df6-f13b5ff21f63"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.946302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d" (OuterVolumeSpecName: "kube-api-access-7m88d") pod "5e79e70c-cf41-46f1-9df6-f13b5ff21f63" (UID: "5e79e70c-cf41-46f1-9df6-f13b5ff21f63"). InnerVolumeSpecName "kube-api-access-7m88d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.946671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e79e70c-cf41-46f1-9df6-f13b5ff21f63" (UID: "5e79e70c-cf41-46f1-9df6-f13b5ff21f63"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.984399 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 29 01:15:43 crc kubenswrapper[4749]: I1129 01:15:43.986138 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.039444 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-config-volume\") on node \"crc\" DevicePath \"\""
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.039536 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.039559 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m88d\" (UniqueName: \"kubernetes.io/projected/5e79e70c-cf41-46f1-9df6-f13b5ff21f63-kube-api-access-7m88d\") on node \"crc\" DevicePath \"\""
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.083640 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.134065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.624237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z" event={"ID":"5e79e70c-cf41-46f1-9df6-f13b5ff21f63","Type":"ContainerDied","Data":"a30e8dd5a8c482cd34871d0bf05ff7442bf78d2b3566ab3baaec7a0594ff2a2d"}
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.624317 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30e8dd5a8c482cd34871d0bf05ff7442bf78d2b3566ab3baaec7a0594ff2a2d"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.624431 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.779926 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.858178 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Nov 29 01:15:44 crc kubenswrapper[4749]: I1129 01:15:44.959867 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 29 01:15:45 crc kubenswrapper[4749]: I1129 01:15:45.074766 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Nov 29 01:15:45 crc kubenswrapper[4749]: I1129 01:15:45.235730 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Nov 29 01:15:45 crc kubenswrapper[4749]: I1129 01:15:45.595331 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Nov 29 01:15:45 crc kubenswrapper[4749]: I1129 01:15:45.712965 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Nov 29 01:15:46 crc kubenswrapper[4749]: I1129 01:15:46.641252 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 29 01:15:46 crc kubenswrapper[4749]: I1129 01:15:46.641703 4749 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d82b38f225395fe91022d3efd8aeb81b8c159648efe23122a2ff7e7483ec7d2f" exitCode=137
Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.179450 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.179570 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.289810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.289926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.289967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.289991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290190 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290383 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290877 4749 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290930 4749 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290956 4749 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.290980 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.301237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.392549 4749 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.698089 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.698222 4749 scope.go:117] "RemoveContainer" containerID="d82b38f225395fe91022d3efd8aeb81b8c159648efe23122a2ff7e7483ec7d2f" Nov 29 01:15:47 crc kubenswrapper[4749]: I1129 01:15:47.698322 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.087431 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.087797 4749 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.100875 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.100933 4749 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="40a26278-848f-4e93-82d5-adeec3a30dc7" Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.104132 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 29 01:15:49 crc kubenswrapper[4749]: I1129 01:15:49.104151 4749 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="40a26278-848f-4e93-82d5-adeec3a30dc7" Nov 29 01:15:58 crc kubenswrapper[4749]: I1129 01:15:58.793812 4749 generic.go:334] "Generic (PLEG): container finished" podID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerID="b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf" exitCode=0 Nov 29 01:15:58 crc kubenswrapper[4749]: I1129 01:15:58.793928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerDied","Data":"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"} Nov 29 01:15:58 crc kubenswrapper[4749]: I1129 01:15:58.796296 4749 scope.go:117] "RemoveContainer" containerID="b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf" Nov 29 01:15:59 crc kubenswrapper[4749]: I1129 01:15:59.806946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerStarted","Data":"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"} Nov 29 01:15:59 crc kubenswrapper[4749]: I1129 01:15:59.809975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:15:59 crc kubenswrapper[4749]: I1129 01:15:59.815367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.264230 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.264903 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerName="controller-manager" containerID="cri-o://625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c" gracePeriod=30 Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.371768 4749 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.372072 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" podUID="0e05f53f-1275-42b8-8d25-1b6f96be0121" containerName="route-controller-manager" containerID="cri-o://ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a" gracePeriod=30 Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.722333 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.775659 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert\") pod \"e05a47eb-0468-48cb-9e4e-19156acdda3a\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.775729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd7ht\" (UniqueName: \"kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht\") pod \"e05a47eb-0468-48cb-9e4e-19156acdda3a\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.775822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config\") pod \"e05a47eb-0468-48cb-9e4e-19156acdda3a\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.775846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles\") pod \"e05a47eb-0468-48cb-9e4e-19156acdda3a\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.775997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca\") pod \"e05a47eb-0468-48cb-9e4e-19156acdda3a\" (UID: \"e05a47eb-0468-48cb-9e4e-19156acdda3a\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.777335 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "e05a47eb-0468-48cb-9e4e-19156acdda3a" (UID: "e05a47eb-0468-48cb-9e4e-19156acdda3a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.777791 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config" (OuterVolumeSpecName: "config") pod "e05a47eb-0468-48cb-9e4e-19156acdda3a" (UID: "e05a47eb-0468-48cb-9e4e-19156acdda3a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.777955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e05a47eb-0468-48cb-9e4e-19156acdda3a" (UID: "e05a47eb-0468-48cb-9e4e-19156acdda3a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.783567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht" (OuterVolumeSpecName: "kube-api-access-dd7ht") pod "e05a47eb-0468-48cb-9e4e-19156acdda3a" (UID: "e05a47eb-0468-48cb-9e4e-19156acdda3a"). InnerVolumeSpecName "kube-api-access-dd7ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.783970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e05a47eb-0468-48cb-9e4e-19156acdda3a" (UID: "e05a47eb-0468-48cb-9e4e-19156acdda3a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.787136 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.845900 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e05f53f-1275-42b8-8d25-1b6f96be0121" containerID="ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a" exitCode=0 Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.845970 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.846007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" event={"ID":"0e05f53f-1275-42b8-8d25-1b6f96be0121","Type":"ContainerDied","Data":"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a"} Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.846048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz" event={"ID":"0e05f53f-1275-42b8-8d25-1b6f96be0121","Type":"ContainerDied","Data":"a61ae5cd38027aada6ef17a75c139f6f725bed3e2b865f8e6ca6f4c31ebe1d8a"} Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.846068 4749 scope.go:117] "RemoveContainer" containerID="ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.848247 4749 generic.go:334] "Generic (PLEG): container finished" podID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerID="625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c" exitCode=0 Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.848302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" event={"ID":"e05a47eb-0468-48cb-9e4e-19156acdda3a","Type":"ContainerDied","Data":"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c"} Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.848336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" event={"ID":"e05a47eb-0468-48cb-9e4e-19156acdda3a","Type":"ContainerDied","Data":"167e8adc5871f830d8bc301e28069ee7630e247de465ce2d993e57f792633a9e"} Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.848396 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-f75rz" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.865172 4749 scope.go:117] "RemoveContainer" containerID="ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a" Nov 29 01:16:04 crc kubenswrapper[4749]: E1129 01:16:04.865794 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a\": container with ID starting with ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a not found: ID does not exist" containerID="ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.865831 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a"} err="failed to get container status \"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a\": rpc error: code = NotFound desc = could not find container \"ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a\": container with ID starting with ca96ce0b6712766c44d033e886617feb7846130fd14a458c4a17a2a98559934a not found: ID does not exist" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.865857 4749 scope.go:117] "RemoveContainer" containerID="625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.876831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpddk\" (UniqueName: \"kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk\") pod \"0e05f53f-1275-42b8-8d25-1b6f96be0121\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.876912 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca\") pod \"0e05f53f-1275-42b8-8d25-1b6f96be0121\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.876986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config\") pod \"0e05f53f-1275-42b8-8d25-1b6f96be0121\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert\") pod \"0e05f53f-1275-42b8-8d25-1b6f96be0121\" (UID: \"0e05f53f-1275-42b8-8d25-1b6f96be0121\") " Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877305 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877327 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877338 4749 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/e05a47eb-0468-48cb-9e4e-19156acdda3a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877349 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e05a47eb-0468-48cb-9e4e-19156acdda3a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.877359 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd7ht\" (UniqueName: \"kubernetes.io/projected/e05a47eb-0468-48cb-9e4e-19156acdda3a-kube-api-access-dd7ht\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.878329 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca" (OuterVolumeSpecName: "client-ca") pod "0e05f53f-1275-42b8-8d25-1b6f96be0121" (UID: "0e05f53f-1275-42b8-8d25-1b6f96be0121"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.878442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config" (OuterVolumeSpecName: "config") pod "0e05f53f-1275-42b8-8d25-1b6f96be0121" (UID: "0e05f53f-1275-42b8-8d25-1b6f96be0121"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.880504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.883967 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-f75rz"] Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.884461 4749 scope.go:117] "RemoveContainer" containerID="625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.884755 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0e05f53f-1275-42b8-8d25-1b6f96be0121" (UID: "0e05f53f-1275-42b8-8d25-1b6f96be0121"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: E1129 01:16:04.885248 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c\": container with ID starting with 625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c not found: ID does not exist" containerID="625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.885290 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c"} err="failed to get container status \"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c\": rpc error: code = NotFound desc = could not find container \"625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c\": container with ID starting with 625a6fa3348f14411898c06872d43f82431d14bb5791e9f369a52bb91023222c not found: ID does not exist" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.885708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk" (OuterVolumeSpecName: "kube-api-access-qpddk") pod "0e05f53f-1275-42b8-8d25-1b6f96be0121" (UID: "0e05f53f-1275-42b8-8d25-1b6f96be0121"). InnerVolumeSpecName "kube-api-access-qpddk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.979716 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpddk\" (UniqueName: \"kubernetes.io/projected/0e05f53f-1275-42b8-8d25-1b6f96be0121-kube-api-access-qpddk\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.979805 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.979830 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e05f53f-1275-42b8-8d25-1b6f96be0121-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:04 crc kubenswrapper[4749]: I1129 01:16:04.979852 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e05f53f-1275-42b8-8d25-1b6f96be0121-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:05 crc kubenswrapper[4749]: I1129 01:16:05.086135 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" path="/var/lib/kubelet/pods/e05a47eb-0468-48cb-9e4e-19156acdda3a/volumes" Nov 29 01:16:05 crc kubenswrapper[4749]: I1129 01:16:05.167556 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:16:05 crc kubenswrapper[4749]: I1129 01:16:05.171409 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wv4pz"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.161502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:06 crc kubenswrapper[4749]: E1129 01:16:06.166218 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e05f53f-1275-42b8-8d25-1b6f96be0121" containerName="route-controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.166326 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e05f53f-1275-42b8-8d25-1b6f96be0121" containerName="route-controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: E1129 01:16:06.166418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerName="controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.166491 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerName="controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: E1129 01:16:06.166591 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.166666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 01:16:06 crc kubenswrapper[4749]: E1129 01:16:06.166843 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e79e70c-cf41-46f1-9df6-f13b5ff21f63" containerName="collect-profiles" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.166920 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e79e70c-cf41-46f1-9df6-f13b5ff21f63" containerName="collect-profiles" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.167172 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.167281 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e05f53f-1275-42b8-8d25-1b6f96be0121" containerName="route-controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.167377 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e79e70c-cf41-46f1-9df6-f13b5ff21f63" containerName="collect-profiles" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.167463 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05a47eb-0468-48cb-9e4e-19156acdda3a" containerName="controller-manager" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.168678 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.173246 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.173298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.173353 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.173633 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.174260 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.175344 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.176743 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.177413 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.178298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.179728 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.180118 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.181463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.181945 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.183371 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.187773 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.203990 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.204876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " 
pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.205025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.205101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.205262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwflq\" (UniqueName: \"kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.211295 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qqd\" (UniqueName: \"kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwflq\" (UniqueName: \"kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.307942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.309830 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.309878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.317505 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert\") pod \"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.331153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwflq\" (UniqueName: \"kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq\") pod 
\"route-controller-manager-5ccd555bf-dwppn\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.409728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.409811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.409859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.409888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.409938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qqd\" (UniqueName: \"kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.411075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.411997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.413261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.415859 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.438809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qqd\" (UniqueName: \"kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd\") pod \"controller-manager-74dc647949-x46rz\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.530323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.549715 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.827515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.867966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:06 crc kubenswrapper[4749]: I1129 01:16:06.869121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" event={"ID":"b7fbb033-3403-4ee7-bda9-e9b269ad305f","Type":"ContainerStarted","Data":"760d56a3031604ae49cefee8ab646216bc65b6d379412ec86dd6896c2dc9bc31"} Nov 29 01:16:06 crc kubenswrapper[4749]: W1129 01:16:06.872034 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3862513_411a_4266_bafd_85da82c040a2.slice/crio-8b1e8c38b8270664b3c9748dd42597e57afa8b63cbbd3f2323cbe204cd55bc2a WatchSource:0}: Error finding container 8b1e8c38b8270664b3c9748dd42597e57afa8b63cbbd3f2323cbe204cd55bc2a: Status 404 returned error can't find the container with id 8b1e8c38b8270664b3c9748dd42597e57afa8b63cbbd3f2323cbe204cd55bc2a Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.082605 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e05f53f-1275-42b8-8d25-1b6f96be0121" path="/var/lib/kubelet/pods/0e05f53f-1275-42b8-8d25-1b6f96be0121/volumes" Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.880404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" event={"ID":"e3862513-411a-4266-bafd-85da82c040a2","Type":"ContainerStarted","Data":"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678"} Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.881439 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.881474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" 
event={"ID":"e3862513-411a-4266-bafd-85da82c040a2","Type":"ContainerStarted","Data":"8b1e8c38b8270664b3c9748dd42597e57afa8b63cbbd3f2323cbe204cd55bc2a"} Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.883454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" event={"ID":"b7fbb033-3403-4ee7-bda9-e9b269ad305f","Type":"ContainerStarted","Data":"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21"} Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.891474 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.920686 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" podStartSLOduration=3.920666469 podStartE2EDuration="3.920666469s" podCreationTimestamp="2025-11-29 01:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:16:07.917487909 +0000 UTC m=+311.089637776" watchObservedRunningTime="2025-11-29 01:16:07.920666469 +0000 UTC m=+311.092816326" Nov 29 01:16:07 crc kubenswrapper[4749]: I1129 01:16:07.952496 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" podStartSLOduration=3.9524626449999998 podStartE2EDuration="3.952462645s" podCreationTimestamp="2025-11-29 01:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:16:07.944252803 +0000 UTC m=+311.116402740" watchObservedRunningTime="2025-11-29 01:16:07.952462645 +0000 UTC m=+311.124612542" Nov 29 01:16:08 crc kubenswrapper[4749]: I1129 01:16:08.039420 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:08 crc kubenswrapper[4749]: I1129 01:16:08.890749 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:08 crc kubenswrapper[4749]: I1129 01:16:08.896699 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:09 crc kubenswrapper[4749]: I1129 01:16:09.895376 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" podUID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" containerName="controller-manager" containerID="cri-o://94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21" gracePeriod=30 Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.467788 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.498699 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54f65f5676-668jx"] Nov 29 01:16:10 crc kubenswrapper[4749]: E1129 01:16:10.499104 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" containerName="controller-manager" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.499126 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" containerName="controller-manager" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.499419 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" containerName="controller-manager" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.500328 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.506999 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54f65f5676-668jx"] Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.577458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca\") pod \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.577532 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qqd\" (UniqueName: \"kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd\") pod \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.577602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles\") pod \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.577630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert\") pod \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.577989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config\") pod \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\" (UID: \"b7fbb033-3403-4ee7-bda9-e9b269ad305f\") " Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b7fbb033-3403-4ee7-bda9-e9b269ad305f" (UID: "b7fbb033-3403-4ee7-bda9-e9b269ad305f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578315 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-proxy-ca-bundles\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-config\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b7fbb033-3403-4ee7-bda9-e9b269ad305f" (UID: "b7fbb033-3403-4ee7-bda9-e9b269ad305f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/f00ac192-8da8-4e73-8846-06586b9fd666-kube-api-access-l9vq9\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00ac192-8da8-4e73-8846-06586b9fd666-serving-cert\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-client-ca\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578773 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config" (OuterVolumeSpecName: "config") pod "b7fbb033-3403-4ee7-bda9-e9b269ad305f" (UID: "b7fbb033-3403-4ee7-bda9-e9b269ad305f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.578982 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.579008 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.579021 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b7fbb033-3403-4ee7-bda9-e9b269ad305f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.594318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b7fbb033-3403-4ee7-bda9-e9b269ad305f" (UID: "b7fbb033-3403-4ee7-bda9-e9b269ad305f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.594343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd" (OuterVolumeSpecName: "kube-api-access-26qqd") pod "b7fbb033-3403-4ee7-bda9-e9b269ad305f" (UID: "b7fbb033-3403-4ee7-bda9-e9b269ad305f"). InnerVolumeSpecName "kube-api-access-26qqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-config\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679604 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/f00ac192-8da8-4e73-8846-06586b9fd666-kube-api-access-l9vq9\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00ac192-8da8-4e73-8846-06586b9fd666-serving-cert\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-client-ca\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-proxy-ca-bundles\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679749 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fbb033-3403-4ee7-bda9-e9b269ad305f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.679761 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26qqd\" (UniqueName: \"kubernetes.io/projected/b7fbb033-3403-4ee7-bda9-e9b269ad305f-kube-api-access-26qqd\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.680922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-client-ca\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.681028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-proxy-ca-bundles\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.681471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00ac192-8da8-4e73-8846-06586b9fd666-config\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.684614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00ac192-8da8-4e73-8846-06586b9fd666-serving-cert\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.700378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vq9\" (UniqueName: \"kubernetes.io/projected/f00ac192-8da8-4e73-8846-06586b9fd666-kube-api-access-l9vq9\") pod \"controller-manager-54f65f5676-668jx\" (UID: \"f00ac192-8da8-4e73-8846-06586b9fd666\") " pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.816333 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.902519 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" containerID="94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21" exitCode=0 Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.902597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" event={"ID":"b7fbb033-3403-4ee7-bda9-e9b269ad305f","Type":"ContainerDied","Data":"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21"} Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.902662 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.902698 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc647949-x46rz" event={"ID":"b7fbb033-3403-4ee7-bda9-e9b269ad305f","Type":"ContainerDied","Data":"760d56a3031604ae49cefee8ab646216bc65b6d379412ec86dd6896c2dc9bc31"} Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.902730 4749 scope.go:117] "RemoveContainer" containerID="94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.925901 4749 scope.go:117] "RemoveContainer" containerID="94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21" Nov 29 01:16:10 crc kubenswrapper[4749]: E1129 01:16:10.926546 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21\": container with ID starting with 94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21 not found: ID does not exist" containerID="94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.926614 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21"} err="failed to get container status \"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21\": rpc error: code = NotFound desc = could not find container \"94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21\": container with ID starting with 94bda5d4a455ab1ceb2d7253e2c065d7f7543bfe9dc85c3eac4b9a0ea2f12b21 not found: ID does not exist" Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.938067 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:10 crc kubenswrapper[4749]: I1129 01:16:10.941156 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74dc647949-x46rz"] Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.089036 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7fbb033-3403-4ee7-bda9-e9b269ad305f" path="/var/lib/kubelet/pods/b7fbb033-3403-4ee7-bda9-e9b269ad305f/volumes" Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.107866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54f65f5676-668jx"] Nov 29 01:16:11 crc kubenswrapper[4749]: W1129 01:16:11.123582 4749 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf00ac192_8da8_4e73_8846_06586b9fd666.slice/crio-f949112100e6a767fc3dac9556989bb8ebbfbdc73b07dd776312984c38928ab8 WatchSource:0}: Error finding container f949112100e6a767fc3dac9556989bb8ebbfbdc73b07dd776312984c38928ab8: Status 404 returned error can't find the container with id f949112100e6a767fc3dac9556989bb8ebbfbdc73b07dd776312984c38928ab8 Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.914114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" event={"ID":"f00ac192-8da8-4e73-8846-06586b9fd666","Type":"ContainerStarted","Data":"6d92bde3277289387822286b34b9b1985ec0b0980106a2ee5fe844ebd613db29"} Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.914634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.914651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" event={"ID":"f00ac192-8da8-4e73-8846-06586b9fd666","Type":"ContainerStarted","Data":"f949112100e6a767fc3dac9556989bb8ebbfbdc73b07dd776312984c38928ab8"} Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.919629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" Nov 29 01:16:11 crc kubenswrapper[4749]: I1129 01:16:11.944657 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54f65f5676-668jx" podStartSLOduration=3.944620606 podStartE2EDuration="3.944620606s" podCreationTimestamp="2025-11-29 01:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:16:11.935455617 +0000 UTC m=+315.107605524" watchObservedRunningTime="2025-11-29 01:16:11.944620606 +0000 UTC m=+315.116770493" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.260285 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.261994 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" podUID="e3862513-411a-4266-bafd-85da82c040a2" containerName="route-controller-manager" containerID="cri-o://b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678" gracePeriod=30 Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.751972 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.823778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert\") pod \"e3862513-411a-4266-bafd-85da82c040a2\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.825147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwflq\" (UniqueName: \"kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq\") pod \"e3862513-411a-4266-bafd-85da82c040a2\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.825269 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config\") pod \"e3862513-411a-4266-bafd-85da82c040a2\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.825378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca\") pod \"e3862513-411a-4266-bafd-85da82c040a2\" (UID: \"e3862513-411a-4266-bafd-85da82c040a2\") " Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.826180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config" (OuterVolumeSpecName: "config") pod "e3862513-411a-4266-bafd-85da82c040a2" (UID: "e3862513-411a-4266-bafd-85da82c040a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.826692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "e3862513-411a-4266-bafd-85da82c040a2" (UID: "e3862513-411a-4266-bafd-85da82c040a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.833778 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3862513-411a-4266-bafd-85da82c040a2" (UID: "e3862513-411a-4266-bafd-85da82c040a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.835030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq" (OuterVolumeSpecName: "kube-api-access-fwflq") pod "e3862513-411a-4266-bafd-85da82c040a2" (UID: "e3862513-411a-4266-bafd-85da82c040a2"). InnerVolumeSpecName "kube-api-access-fwflq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.927848 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwflq\" (UniqueName: \"kubernetes.io/projected/e3862513-411a-4266-bafd-85da82c040a2-kube-api-access-fwflq\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.927907 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.927927 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3862513-411a-4266-bafd-85da82c040a2-client-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:44 crc kubenswrapper[4749]: I1129 01:16:44.927948 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3862513-411a-4266-bafd-85da82c040a2-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.156968 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3862513-411a-4266-bafd-85da82c040a2" containerID="b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678" exitCode=0 Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.157077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" event={"ID":"e3862513-411a-4266-bafd-85da82c040a2","Type":"ContainerDied","Data":"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678"} Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.157148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" event={"ID":"e3862513-411a-4266-bafd-85da82c040a2","Type":"ContainerDied","Data":"8b1e8c38b8270664b3c9748dd42597e57afa8b63cbbd3f2323cbe204cd55bc2a"} Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.157190 4749 scope.go:117] "RemoveContainer" containerID="b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678" Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.159617 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn" Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.188636 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.196812 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ccd555bf-dwppn"] Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.197745 4749 scope.go:117] "RemoveContainer" containerID="b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678" Nov 29 01:16:45 crc kubenswrapper[4749]: E1129 01:16:45.198685 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678\": container with ID starting with b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678 not found: ID does not exist" containerID="b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678" Nov 29 01:16:45 crc kubenswrapper[4749]: I1129 01:16:45.198773 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678"} err="failed to get container status \"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678\": rpc error: code = NotFound desc = could not find container \"b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678\": container with ID starting with b4ef1ff0417961c85013ed1f128c4c9af484cf561639d30d5adbabf61966d678 not found: ID does not exist" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.210623 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq"] Nov 29 01:16:46 crc kubenswrapper[4749]: E1129 01:16:46.210910 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3862513-411a-4266-bafd-85da82c040a2" containerName="route-controller-manager" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.210926 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3862513-411a-4266-bafd-85da82c040a2" containerName="route-controller-manager" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.211044 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3862513-411a-4266-bafd-85da82c040a2" containerName="route-controller-manager" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.211539 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.215947 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.216004 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.216017 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.215962 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.216036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.216496 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.236267 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq"] Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.253020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49pd\" (UniqueName: \"kubernetes.io/projected/1205502a-7602-438d-8fb8-180a32a3ff6e-kube-api-access-z49pd\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.253105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1205502a-7602-438d-8fb8-180a32a3ff6e-serving-cert\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.253154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-config\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.253268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-client-ca\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.354000 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-client-ca\") pod 
\"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.354093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49pd\" (UniqueName: \"kubernetes.io/projected/1205502a-7602-438d-8fb8-180a32a3ff6e-kube-api-access-z49pd\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.354126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1205502a-7602-438d-8fb8-180a32a3ff6e-serving-cert\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.354154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-config\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.355477 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-config\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.355587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1205502a-7602-438d-8fb8-180a32a3ff6e-client-ca\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.361786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1205502a-7602-438d-8fb8-180a32a3ff6e-serving-cert\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.386393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49pd\" (UniqueName: \"kubernetes.io/projected/1205502a-7602-438d-8fb8-180a32a3ff6e-kube-api-access-z49pd\") pod \"route-controller-manager-7967b78d9f-m4fcq\" (UID: \"1205502a-7602-438d-8fb8-180a32a3ff6e\") " pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.530718 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:46 crc kubenswrapper[4749]: I1129 01:16:46.810784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq"] Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.090324 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3862513-411a-4266-bafd-85da82c040a2" path="/var/lib/kubelet/pods/e3862513-411a-4266-bafd-85da82c040a2/volumes" Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.194253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" event={"ID":"1205502a-7602-438d-8fb8-180a32a3ff6e","Type":"ContainerStarted","Data":"c4eff72f016afbfd18fcafdd59e8df1c375ce2161da6cfa8b9082ce99e8352c6"} Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.194351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" event={"ID":"1205502a-7602-438d-8fb8-180a32a3ff6e","Type":"ContainerStarted","Data":"356dc30c396e5fed7343446d3b5263cba59119121042560e2ac425607bc9c3d8"} Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.194873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.226044 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" podStartSLOduration=3.226004178 podStartE2EDuration="3.226004178s" podCreationTimestamp="2025-11-29 01:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:16:47.222362307 +0000 UTC m=+350.394512164" watchObservedRunningTime="2025-11-29 01:16:47.226004178 +0000 UTC m=+350.398154065" Nov 29 01:16:47 crc kubenswrapper[4749]: I1129 01:16:47.640930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7967b78d9f-m4fcq" Nov 29 01:16:55 crc kubenswrapper[4749]: I1129 01:16:55.374346 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:16:55 crc kubenswrapper[4749]: I1129 01:16:55.376390 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:17:10 crc kubenswrapper[4749]: I1129 01:17:10.953401 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bdtf6"] Nov 29 01:17:10 crc kubenswrapper[4749]: I1129 01:17:10.955978 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:10 crc kubenswrapper[4749]: I1129 01:17:10.970383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bdtf6"] Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.090773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-bound-sa-token\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.091713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-trusted-ca\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.091823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-tls\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.092718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.092860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.093071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mdd\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-kube-api-access-r7mdd\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.093236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-certificates\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.093335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.131529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.194684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.194796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mdd\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-kube-api-access-r7mdd\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.194858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-certificates\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.194917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.194966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-bound-sa-token\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.195008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-trusted-ca\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.195040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-tls\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.195176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.197528 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-certificates\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.197857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-trusted-ca\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.206718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.206811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-registry-tls\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.220521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-bound-sa-token\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.222091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mdd\" (UniqueName: \"kubernetes.io/projected/06cc4e7c-5482-416f-a1d8-0a6182c6ff3b-kube-api-access-r7mdd\") pod \"image-registry-66df7c8f76-bdtf6\" (UID: \"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b\") " pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.285278 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:11 crc kubenswrapper[4749]: I1129 01:17:11.746375 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bdtf6"] Nov 29 01:17:12 crc kubenswrapper[4749]: I1129 01:17:12.418763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" event={"ID":"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b","Type":"ContainerStarted","Data":"b1ec1a87ac9185093807e280c7d9760161d81f4fd3a50322ac62e4cebdbdce1d"} Nov 29 01:17:12 crc kubenswrapper[4749]: I1129 01:17:12.419344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" event={"ID":"06cc4e7c-5482-416f-a1d8-0a6182c6ff3b","Type":"ContainerStarted","Data":"ee5c510110854208410b802b94c8db00fb245655b5b7d234cf65980fca0a40aa"} Nov 29 01:17:12 crc kubenswrapper[4749]: I1129 01:17:12.419366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:12 crc kubenswrapper[4749]: I1129 01:17:12.453021 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" podStartSLOduration=2.452996922 podStartE2EDuration="2.452996922s" podCreationTimestamp="2025-11-29 01:17:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:17:12.448969531 +0000 UTC m=+375.621119448" watchObservedRunningTime="2025-11-29 01:17:12.452996922 +0000 UTC m=+375.625146789" Nov 29 01:17:25 crc kubenswrapper[4749]: I1129 01:17:25.374616 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:17:25 crc kubenswrapper[4749]: I1129 01:17:25.375885 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:17:31 crc kubenswrapper[4749]: I1129 01:17:31.294347 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bdtf6" Nov 29 01:17:31 crc kubenswrapper[4749]: I1129 01:17:31.374114 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.708397 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.713693 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr8r9"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.715933 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5rwk9" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="registry-server" containerID="cri-o://50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad" 
gracePeriod=30 Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.716327 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qr8r9" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="registry-server" containerID="cri-o://0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab" gracePeriod=30 Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.719529 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.719858 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator" containerID="cri-o://7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495" gracePeriod=30 Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.765660 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.769909 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw8rb" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server" containerID="cri-o://a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e" gracePeriod=30 Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.781187 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m564m"] Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.787869 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.813572 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-hw8rb" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server" probeResult="failure" output=""
Nov 29 01:17:50 crc kubenswrapper[4749]: E1129 01:17:50.814025 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: EOF, stdout: , stderr: , exit code -1" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e" cmd=["grpc_health_probe","-addr=:50051"]
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.814145 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"]
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.814528 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2rn4" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="registry-server" containerID="cri-o://1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958" gracePeriod=30
Nov 29 01:17:50 crc kubenswrapper[4749]: E1129 01:17:50.817629 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e is running failed: container process not found" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e" cmd=["grpc_health_probe","-addr=:50051"]
Nov 29 01:17:50 crc kubenswrapper[4749]: E1129 01:17:50.818018 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e is running failed: container process not found" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e" cmd=["grpc_health_probe","-addr=:50051"]
Nov 29 01:17:50 crc kubenswrapper[4749]: E1129 01:17:50.818042 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hw8rb" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.823958 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m564m"]
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.884247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.884309 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.884355 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzs82\" (UniqueName: \"kubernetes.io/projected/1b11e2f5-ae7c-4297-97cb-e217d0947051-kube-api-access-qzs82\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.986608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzs82\" (UniqueName: \"kubernetes.io/projected/1b11e2f5-ae7c-4297-97cb-e217d0947051-kube-api-access-qzs82\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.986762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.986805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:50 crc kubenswrapper[4749]: I1129 01:17:50.989097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.001961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1b11e2f5-ae7c-4297-97cb-e217d0947051-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.008635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzs82\" (UniqueName: \"kubernetes.io/projected/1b11e2f5-ae7c-4297-97cb-e217d0947051-kube-api-access-qzs82\") pod \"marketplace-operator-79b997595-m564m\" (UID: \"1b11e2f5-ae7c-4297-97cb-e217d0947051\") " pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.171428 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.176582 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwk9"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.241963 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr8r9"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.254661 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw8rb"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.261400 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.292615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities\") pod \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294552 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhpq2\" (UniqueName: \"kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2\") pod \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294665 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content\") pod \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\" (UID: \"574f9eb6-8ce6-4f90-8e38-47be16ec96d1\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56lrj\" (UniqueName: \"kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj\") pod \"c6e1d685-7223-41bd-b180-ee357d754e89\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities\") pod \"1dd82406-f875-4ec7-bbe9-8424b2725f51\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cj8\" (UniqueName: \"kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8\") pod \"1dd82406-f875-4ec7-bbe9-8424b2725f51\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294882 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities\") pod \"c6e1d685-7223-41bd-b180-ee357d754e89\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics\") pod \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.294974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content\") pod \"1dd82406-f875-4ec7-bbe9-8424b2725f51\" (UID: \"1dd82406-f875-4ec7-bbe9-8424b2725f51\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.296844 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities" (OuterVolumeSpecName: "utilities") pod "1dd82406-f875-4ec7-bbe9-8424b2725f51" (UID: "1dd82406-f875-4ec7-bbe9-8424b2725f51"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.300310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities" (OuterVolumeSpecName: "utilities") pod "c6e1d685-7223-41bd-b180-ee357d754e89" (UID: "c6e1d685-7223-41bd-b180-ee357d754e89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.304701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2" (OuterVolumeSpecName: "kube-api-access-lhpq2") pod "574f9eb6-8ce6-4f90-8e38-47be16ec96d1" (UID: "574f9eb6-8ce6-4f90-8e38-47be16ec96d1"). InnerVolumeSpecName "kube-api-access-lhpq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.304788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj" (OuterVolumeSpecName: "kube-api-access-56lrj") pod "c6e1d685-7223-41bd-b180-ee357d754e89" (UID: "c6e1d685-7223-41bd-b180-ee357d754e89"). InnerVolumeSpecName "kube-api-access-56lrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.307623 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d34de2ad-4a60-49b0-b63b-5f610370bbd4" (UID: "d34de2ad-4a60-49b0-b63b-5f610370bbd4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.311651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities" (OuterVolumeSpecName: "utilities") pod "574f9eb6-8ce6-4f90-8e38-47be16ec96d1" (UID: "574f9eb6-8ce6-4f90-8e38-47be16ec96d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.320029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8" (OuterVolumeSpecName: "kube-api-access-p6cj8") pod "1dd82406-f875-4ec7-bbe9-8424b2725f51" (UID: "1dd82406-f875-4ec7-bbe9-8424b2725f51"). InnerVolumeSpecName "kube-api-access-p6cj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.365173 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rn4"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.382296 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dd82406-f875-4ec7-bbe9-8424b2725f51" (UID: "1dd82406-f875-4ec7-bbe9-8424b2725f51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.382361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "574f9eb6-8ce6-4f90-8e38-47be16ec96d1" (UID: "574f9eb6-8ce6-4f90-8e38-47be16ec96d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.397037 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content\") pod \"c6e1d685-7223-41bd-b180-ee357d754e89\" (UID: \"c6e1d685-7223-41bd-b180-ee357d754e89\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.397090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca\") pod \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.397119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fplk5\" (UniqueName: \"kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5\") pod \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\" (UID: \"d34de2ad-4a60-49b0-b63b-5f610370bbd4\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399221 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities\") pod \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz6v4\" (UniqueName: \"kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4\") pod \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content\") pod \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\" (UID: \"deff97e4-7feb-44d1-8f74-b1b5bd302b9e\") "
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d34de2ad-4a60-49b0-b63b-5f610370bbd4" (UID: "d34de2ad-4a60-49b0-b63b-5f610370bbd4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399632 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6cj8\" (UniqueName: \"kubernetes.io/projected/1dd82406-f875-4ec7-bbe9-8424b2725f51-kube-api-access-p6cj8\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399655 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399666 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399677 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399689 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d34de2ad-4a60-49b0-b63b-5f610370bbd4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399699 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399709 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhpq2\" (UniqueName: \"kubernetes.io/projected/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-kube-api-access-lhpq2\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399719 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574f9eb6-8ce6-4f90-8e38-47be16ec96d1-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399730 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56lrj\" (UniqueName: \"kubernetes.io/projected/c6e1d685-7223-41bd-b180-ee357d754e89-kube-api-access-56lrj\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.399739 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dd82406-f875-4ec7-bbe9-8424b2725f51-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.400186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities" (OuterVolumeSpecName: "utilities") pod "deff97e4-7feb-44d1-8f74-b1b5bd302b9e" (UID: "deff97e4-7feb-44d1-8f74-b1b5bd302b9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.404567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5" (OuterVolumeSpecName: "kube-api-access-fplk5") pod "d34de2ad-4a60-49b0-b63b-5f610370bbd4" (UID: "d34de2ad-4a60-49b0-b63b-5f610370bbd4"). InnerVolumeSpecName "kube-api-access-fplk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.405869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4" (OuterVolumeSpecName: "kube-api-access-bz6v4") pod "deff97e4-7feb-44d1-8f74-b1b5bd302b9e" (UID: "deff97e4-7feb-44d1-8f74-b1b5bd302b9e"). InnerVolumeSpecName "kube-api-access-bz6v4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.418108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6e1d685-7223-41bd-b180-ee357d754e89" (UID: "c6e1d685-7223-41bd-b180-ee357d754e89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.501732 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6e1d685-7223-41bd-b180-ee357d754e89-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.501779 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fplk5\" (UniqueName: \"kubernetes.io/projected/d34de2ad-4a60-49b0-b63b-5f610370bbd4-kube-api-access-fplk5\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.501794 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.501804 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz6v4\" (UniqueName: \"kubernetes.io/projected/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-kube-api-access-bz6v4\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.522437 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "deff97e4-7feb-44d1-8f74-b1b5bd302b9e" (UID: "deff97e4-7feb-44d1-8f74-b1b5bd302b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.603798 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deff97e4-7feb-44d1-8f74-b1b5bd302b9e-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.699705 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-m564m"]
Nov 29 01:17:51 crc kubenswrapper[4749]: W1129 01:17:51.714492 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b11e2f5_ae7c_4297_97cb_e217d0947051.slice/crio-c4afc4b426fbbdb122c909d0a3c99b11c7e32d098943c313a6d83ea395812dac WatchSource:0}: Error finding container c4afc4b426fbbdb122c909d0a3c99b11c7e32d098943c313a6d83ea395812dac: Status 404 returned error can't find the container with id c4afc4b426fbbdb122c909d0a3c99b11c7e32d098943c313a6d83ea395812dac
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.790111 4749 generic.go:334] "Generic (PLEG): container finished" podID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerID="0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab" exitCode=0
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.790242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerDied","Data":"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.790291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qr8r9" event={"ID":"1dd82406-f875-4ec7-bbe9-8424b2725f51","Type":"ContainerDied","Data":"07d62d0b609c2270412b4eac3a202d2d200b78a1d479c13cd9180c95d8d9ecf2"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.790323 4749 scope.go:117] "RemoveContainer" containerID="0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.790513 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qr8r9"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.800726 4749 generic.go:334] "Generic (PLEG): container finished" podID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerID="50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad" exitCode=0
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.800877 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5rwk9"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.801311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerDied","Data":"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.801703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5rwk9" event={"ID":"574f9eb6-8ce6-4f90-8e38-47be16ec96d1","Type":"ContainerDied","Data":"8b87e758bac0ee2acdc1b66d8438ec72517fb4ad79b82aaa1c5116128a7f68d8"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.805447 4749 generic.go:334] "Generic (PLEG): container finished" podID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerID="1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958" exitCode=0
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.805544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerDied","Data":"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.805558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2rn4"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.805583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2rn4" event={"ID":"deff97e4-7feb-44d1-8f74-b1b5bd302b9e","Type":"ContainerDied","Data":"690343b879f624738b826f7d5e5c9e3881d624e68fa34fc2a6fe0d402f87c3d2"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.808587 4749 generic.go:334] "Generic (PLEG): container finished" podID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerID="7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495" exitCode=0
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.808626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerDied","Data":"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.808655 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.808666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47pxv" event={"ID":"d34de2ad-4a60-49b0-b63b-5f610370bbd4","Type":"ContainerDied","Data":"7277fa7bfb7a891f31b6f55e24826af3efd14671526c72a089706a3acbe937c3"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.810897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m564m" event={"ID":"1b11e2f5-ae7c-4297-97cb-e217d0947051","Type":"ContainerStarted","Data":"c4afc4b426fbbdb122c909d0a3c99b11c7e32d098943c313a6d83ea395812dac"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.813070 4749 scope.go:117] "RemoveContainer" containerID="a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.821817 4749 generic.go:334] "Generic (PLEG): container finished" podID="c6e1d685-7223-41bd-b180-ee357d754e89" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e" exitCode=0
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.821883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerDied","Data":"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.821925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw8rb" event={"ID":"c6e1d685-7223-41bd-b180-ee357d754e89","Type":"ContainerDied","Data":"23282cc5f3bfd955a34bfbb16401a64391413033269811f6470c93fbf695c7ea"}
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.822017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw8rb"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.844589 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qr8r9"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.861259 4749 scope.go:117] "RemoveContainer" containerID="5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.864825 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qr8r9"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.876435 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.889286 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2rn4"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.903532 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.906697 4749 scope.go:117] "RemoveContainer" containerID="0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"
Nov 29 01:17:51 crc kubenswrapper[4749]: E1129 01:17:51.909448 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab\": container with ID starting with 0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab not found: ID does not exist" containerID="0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.909510 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab"} err="failed to get container status \"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab\": rpc error: code = NotFound desc = could not find container \"0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab\": container with ID starting with 0df1784e9bb990658daf088148fb3a51ea6708977ec43b39f50e4503862061ab not found: ID does not exist"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.909529 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw8rb"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.909545 4749 scope.go:117] "RemoveContainer" containerID="a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a"
Nov 29 01:17:51 crc kubenswrapper[4749]: E1129 01:17:51.910971 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a\": container with ID starting with a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a not found: ID does not exist" containerID="a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.911042 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a"} err="failed to get container status \"a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a\": rpc error: code = NotFound desc = could not find container \"a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a\": container with ID starting with a1956326237f9d580bcf153d031c700dcca8e95785a0d73da483a5e2dd10740a not found: ID does not exist"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.911119 4749 scope.go:117] "RemoveContainer" containerID="5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4"
Nov 29 01:17:51 crc kubenswrapper[4749]: E1129 01:17:51.911822 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4\": container with ID starting with 5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4 not found: ID does not exist" containerID="5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.911863 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.911874 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4"} err="failed to get container status \"5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4\": rpc error: code = NotFound desc = could not find container \"5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4\": container with ID starting with 5f589ea3f8cc8d1f24a5e575b3291611f85c109b7590a76ea2946784addcbec4 not found: ID does not exist"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.911921 4749 scope.go:117] "RemoveContainer" containerID="50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.914653 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5rwk9"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.918585 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.939583 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47pxv"]
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.965091 4749 scope.go:117] "RemoveContainer" containerID="e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"
Nov 29 01:17:51 crc kubenswrapper[4749]: I1129 01:17:51.982681 4749 scope.go:117] "RemoveContainer" containerID="635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.003578 4749 scope.go:117] "RemoveContainer" containerID="50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.004186 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad\": container with ID starting with 50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad not found: ID does not exist" containerID="50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.004270 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad"} err="failed to get container status \"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad\": rpc error: code = NotFound desc = could not find container \"50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad\": container with ID starting with 50177cfcc438c4b4f6010836f6d8a488e1a410952f65b781122d89c0154f44ad not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.004316 4749 scope.go:117] "RemoveContainer" containerID="e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.004887 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272\": container with ID starting with e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272 not found: ID does not exist" containerID="e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.004911 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272"} err="failed to get container status \"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272\": rpc error: code = NotFound desc = could not find container \"e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272\": container with ID starting with e46dbd850e5c9367452e1aeeccfe2ad0c0b5d3e10be87b1213b3d618ba8dd272 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.004925 4749 scope.go:117] "RemoveContainer" containerID="635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.005226 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d\": container with ID starting with 635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d not found: ID does not exist" containerID="635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.005254 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d"} err="failed to get container status \"635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d\": rpc error: code = NotFound desc = could not find container \"635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d\": container with ID starting with 635500ed800d8196d608482a4f57ee5c2ff82a4ee2f6dfaefc9750d6e068c70d not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.005269 4749 scope.go:117] "RemoveContainer" containerID="1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.019375 4749 scope.go:117] "RemoveContainer" containerID="672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.044437 4749 scope.go:117] "RemoveContainer" containerID="50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.063047 4749 scope.go:117] "RemoveContainer" containerID="1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.064245 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958\": container with ID starting with 1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958 not found: ID does not exist" containerID="1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.064290 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958"} err="failed to get container status \"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958\": rpc error: code = NotFound desc = could not find container \"1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958\": container with ID starting with 1f77cad0cd91449412083cfd715ff52e0586f285a49479c613653c9a33143958 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.064324 4749 scope.go:117] "RemoveContainer" containerID="672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.064749 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56\": container with ID starting with 672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56 not found: ID does not exist" containerID="672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.064783 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56"} err="failed to get container status \"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56\": rpc error: code = NotFound desc = could not find container \"672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56\": container with ID starting with 672d4b9bde37e3773de22144dbe042c5a0b2482310d237cba5be5f6a05c5ff56 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.064816 4749 scope.go:117] "RemoveContainer" containerID="50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.065072 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51\": container with ID starting with 50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51 not found: ID does not exist" containerID="50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.065102 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51"} err="failed to get container status \"50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51\": rpc error: code = NotFound desc = could not find container \"50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51\": container with ID starting with 50212862f6ed2f931426d89cd457eec5a363af7010a754d7ace80b7eaf719b51 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.065128 4749 scope.go:117] "RemoveContainer" containerID="7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.080557 4749 scope.go:117] "RemoveContainer" containerID="b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.098617 4749 scope.go:117] "RemoveContainer" containerID="7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.099550 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495\": container with ID starting with 7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495 not found: ID does not exist" containerID="7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.099594 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495"} err="failed to get container status \"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495\": rpc error: code = NotFound desc = could not find container \"7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495\": container with ID starting with 7e5ce923f926de4c1483cb92971856a53b8cd0b17ea524550d554fe7dafa6495 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.099624 4749 scope.go:117] "RemoveContainer" containerID="b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.100131 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf\": container with ID starting with b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf not found: ID does not exist" containerID="b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.100162 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf"} err="failed to get container status \"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf\": rpc error: code = NotFound desc = could not find container \"b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf\": container with ID starting with b6e50ec821c34c26535d51b5e5b1f135d37bef52e54d0d881eab8646df746bdf not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.100184 4749 scope.go:117] "RemoveContainer" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.118816 4749 scope.go:117] "RemoveContainer" containerID="9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.141813 4749 scope.go:117] "RemoveContainer" containerID="953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.162135 4749 scope.go:117] "RemoveContainer" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.162781 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e\": container with ID starting with a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e not found: ID does not exist" containerID="a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.162831 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e"} err="failed to get container status \"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e\": rpc error: code = NotFound desc = could not find container \"a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e\": container with ID starting with a5be7a2a762c14134d057c7fe67a7be45d31bcd653a6485cbb67dcd492c0895e not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.162874 4749 scope.go:117] "RemoveContainer" containerID="9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.163354 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b\": container with ID starting with 9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b not found: ID does not exist" containerID="9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.163382 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b"} err="failed to get container status \"9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b\": rpc error: code = NotFound desc = could not find container \"9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b\": container with ID starting with 9060e6dc116ea56fa591ebe13c0fd038acbf399edeeeb52632004eedd954ab7b not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.163409 4749 scope.go:117] "RemoveContainer" containerID="953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.164985 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1\": container with ID starting with 953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1 not found: ID does not exist" containerID="953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.165014 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1"} err="failed to get container status \"953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1\": rpc error: code = NotFound desc = could not find container \"953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1\": container with ID starting with 953aa047fdb0ad512254ec4e4a14134eb0d8105d13a296ab9f483ad3234489a1 not found: ID does not exist"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.835009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-m564m" event={"ID":"1b11e2f5-ae7c-4297-97cb-e217d0947051","Type":"ContainerStarted","Data":"3d7959354feb03bdb4e46949e03d349807ea948b19fefce2e12bb622aac180f7"}
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.836446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.841112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-m564m"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.894953 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-m564m" podStartSLOduration=2.894913405 podStartE2EDuration="2.894913405s" podCreationTimestamp="2025-11-29 01:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:17:52.860406305 +0000 UTC m=+416.032556172" watchObservedRunningTime="2025-11-29 01:17:52.894913405 +0000 UTC m=+416.067063292"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941336 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzcw"]
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941711 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941736 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941759 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941772 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941786 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941798 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941810 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941819 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941831 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941839 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941857 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941871 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941880 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941893 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941903 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941915 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941926 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="extract-utilities"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941944 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941957 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941966 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941980 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.941988 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.941999 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942008 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: E1129 01:17:52.942018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942025 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="extract-content"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942156 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942173 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942188 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942214 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.942226 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" containerName="registry-server"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.943983 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" containerName="marketplace-operator"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.948063 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzcw"]
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.948178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:52 crc kubenswrapper[4749]: I1129 01:17:52.955463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.022817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-utilities\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.022875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgtc\" (UniqueName: \"kubernetes.io/projected/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-kube-api-access-tcgtc\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.023154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-catalog-content\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.086489 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd82406-f875-4ec7-bbe9-8424b2725f51" path="/var/lib/kubelet/pods/1dd82406-f875-4ec7-bbe9-8424b2725f51/volumes"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.088426 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574f9eb6-8ce6-4f90-8e38-47be16ec96d1" path="/var/lib/kubelet/pods/574f9eb6-8ce6-4f90-8e38-47be16ec96d1/volumes"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.089300 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e1d685-7223-41bd-b180-ee357d754e89" path="/var/lib/kubelet/pods/c6e1d685-7223-41bd-b180-ee357d754e89/volumes"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.090955 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34de2ad-4a60-49b0-b63b-5f610370bbd4" path="/var/lib/kubelet/pods/d34de2ad-4a60-49b0-b63b-5f610370bbd4/volumes"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.091691 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deff97e4-7feb-44d1-8f74-b1b5bd302b9e" path="/var/lib/kubelet/pods/deff97e4-7feb-44d1-8f74-b1b5bd302b9e/volumes"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.125538 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lk96f"]
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.127479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgtc\" (UniqueName: \"kubernetes.io/projected/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-kube-api-access-tcgtc\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.127582 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.127754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-catalog-content\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.127866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-utilities\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.129212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-utilities\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.129221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-catalog-content\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.141004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.148161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lk96f"]
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.158936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgtc\" (UniqueName: \"kubernetes.io/projected/b44c30b9-6b5c-40fd-9f73-0072f941ffeb-kube-api-access-tcgtc\") pod \"redhat-marketplace-rvzcw\" (UID: \"b44c30b9-6b5c-40fd-9f73-0072f941ffeb\") " pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.230138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-catalog-content\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.230334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-utilities\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.230382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znct\" (UniqueName: \"kubernetes.io/projected/ef97226b-aa25-4088-a39b-0015a132dd8c-kube-api-access-6znct\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.271947 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvzcw"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.330874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-catalog-content\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.331338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-utilities\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.331366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znct\" (UniqueName: \"kubernetes.io/projected/ef97226b-aa25-4088-a39b-0015a132dd8c-kube-api-access-6znct\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.331631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-catalog-content\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.331949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef97226b-aa25-4088-a39b-0015a132dd8c-utilities\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.353323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znct\" (UniqueName: \"kubernetes.io/projected/ef97226b-aa25-4088-a39b-0015a132dd8c-kube-api-access-6znct\") pod \"redhat-operators-lk96f\" (UID: \"ef97226b-aa25-4088-a39b-0015a132dd8c\") " pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.462478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lk96f"
Nov 29 01:17:53 crc kubenswrapper[4749]: W1129 01:17:53.734782 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44c30b9_6b5c_40fd_9f73_0072f941ffeb.slice/crio-2b3a85f677b298aff80ef5eb3d884053a01ac50494c7fe037e8ddd0cefd0b04a WatchSource:0}: Error finding container 2b3a85f677b298aff80ef5eb3d884053a01ac50494c7fe037e8ddd0cefd0b04a: Status 404 returned error can't find the container with id 2b3a85f677b298aff80ef5eb3d884053a01ac50494c7fe037e8ddd0cefd0b04a
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.735939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvzcw"]
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.856461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzcw" event={"ID":"b44c30b9-6b5c-40fd-9f73-0072f941ffeb","Type":"ContainerStarted","Data":"cf798473cef82d559891623902dd39a514322a5dbe8d5a26e99f26be97462421"}
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.856910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzcw" event={"ID":"b44c30b9-6b5c-40fd-9f73-0072f941ffeb","Type":"ContainerStarted","Data":"2b3a85f677b298aff80ef5eb3d884053a01ac50494c7fe037e8ddd0cefd0b04a"}
Nov 29 01:17:53 crc kubenswrapper[4749]: I1129 01:17:53.889213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lk96f"]
Nov 29 01:17:53 crc kubenswrapper[4749]: W1129 01:17:53.954162 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef97226b_aa25_4088_a39b_0015a132dd8c.slice/crio-9e22912ac961882f9bafda4bc249faaec3d1f9250ff15992f7f39de5a6b7fb0b WatchSource:0}: Error finding container 9e22912ac961882f9bafda4bc249faaec3d1f9250ff15992f7f39de5a6b7fb0b: Status 404 returned error can't find the container with id 9e22912ac961882f9bafda4bc249faaec3d1f9250ff15992f7f39de5a6b7fb0b
Nov 29 01:17:54 crc kubenswrapper[4749]: I1129 01:17:54.867984 4749 generic.go:334] "Generic (PLEG): container finished" podID="ef97226b-aa25-4088-a39b-0015a132dd8c" containerID="a871fc275395ae8d68629589e6fde3a933f151161f5602a966671708cfe8fe4a" exitCode=0
Nov 29 01:17:54 crc kubenswrapper[4749]: I1129 01:17:54.868115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lk96f" event={"ID":"ef97226b-aa25-4088-a39b-0015a132dd8c","Type":"ContainerDied","Data":"a871fc275395ae8d68629589e6fde3a933f151161f5602a966671708cfe8fe4a"}
Nov 29 01:17:54 crc kubenswrapper[4749]: I1129 01:17:54.868250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lk96f"
event={"ID":"ef97226b-aa25-4088-a39b-0015a132dd8c","Type":"ContainerStarted","Data":"9e22912ac961882f9bafda4bc249faaec3d1f9250ff15992f7f39de5a6b7fb0b"} Nov 29 01:17:54 crc kubenswrapper[4749]: I1129 01:17:54.880911 4749 generic.go:334] "Generic (PLEG): container finished" podID="b44c30b9-6b5c-40fd-9f73-0072f941ffeb" containerID="cf798473cef82d559891623902dd39a514322a5dbe8d5a26e99f26be97462421" exitCode=0 Nov 29 01:17:54 crc kubenswrapper[4749]: I1129 01:17:54.881026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzcw" event={"ID":"b44c30b9-6b5c-40fd-9f73-0072f941ffeb","Type":"ContainerDied","Data":"cf798473cef82d559891623902dd39a514322a5dbe8d5a26e99f26be97462421"} Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.328610 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-48gh2"] Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.330031 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.338131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.379320 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.379921 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.380000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.380367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-utilities\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.380583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-catalog-content\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.380725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5szh\" (UniqueName: \"kubernetes.io/projected/ec2edf35-170e-4586-8e6f-c563db51b6b7-kube-api-access-d5szh\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.381032 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48gh2"] Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.382157 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.382336 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26" gracePeriod=600 Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.482941 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-utilities\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.483066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-catalog-content\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.483151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5szh\" (UniqueName: \"kubernetes.io/projected/ec2edf35-170e-4586-8e6f-c563db51b6b7-kube-api-access-d5szh\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.483903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-utilities\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.483909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec2edf35-170e-4586-8e6f-c563db51b6b7-catalog-content\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.511120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5szh\" (UniqueName: \"kubernetes.io/projected/ec2edf35-170e-4586-8e6f-c563db51b6b7-kube-api-access-d5szh\") pod \"certified-operators-48gh2\" (UID: \"ec2edf35-170e-4586-8e6f-c563db51b6b7\") " pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.528006 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cgq68"] Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.529580 4749 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.539119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.575745 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgq68"] Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.588970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-utilities\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.589027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-catalog-content\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.589063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97kkn\" (UniqueName: \"kubernetes.io/projected/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-kube-api-access-97kkn\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.690860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-utilities\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.690914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-catalog-content\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.690957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97kkn\" (UniqueName: \"kubernetes.io/projected/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-kube-api-access-97kkn\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.692058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-utilities\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.692919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-catalog-content\") pod 
\"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.696810 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.724731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kkn\" (UniqueName: \"kubernetes.io/projected/7d47bb92-91e4-4b99-9c6a-86ec5c95396a-kube-api-access-97kkn\") pod \"community-operators-cgq68\" (UID: \"7d47bb92-91e4-4b99-9c6a-86ec5c95396a\") " pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.888629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.900464 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26" exitCode=0 Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.900514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26"} Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.900551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147"} Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.900574 4749 scope.go:117] "RemoveContainer" containerID="d31ad0cebf4294eac40dc0bdc9cfa18cda5ba4ed54fd25ada984c17380395c1a" Nov 29 01:17:55 crc kubenswrapper[4749]: I1129 01:17:55.956801 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48gh2"] Nov 29 01:17:55 crc kubenswrapper[4749]: W1129 01:17:55.959986 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2edf35_170e_4586_8e6f_c563db51b6b7.slice/crio-fc7978405c0843c01c79924bbc1f6b16aa6804954d5e4c1671b1499937920f26 WatchSource:0}: Error finding container fc7978405c0843c01c79924bbc1f6b16aa6804954d5e4c1671b1499937920f26: Status 404 returned error can't find the container with id fc7978405c0843c01c79924bbc1f6b16aa6804954d5e4c1671b1499937920f26 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.146865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgq68"] Nov 29 01:17:56 crc kubenswrapper[4749]: W1129 01:17:56.289377 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d47bb92_91e4_4b99_9c6a_86ec5c95396a.slice/crio-bcfbc7a44994cc883bdb4e0a3a335beb51c6fb737063f97474b167f30233f28d WatchSource:0}: Error finding container bcfbc7a44994cc883bdb4e0a3a335beb51c6fb737063f97474b167f30233f28d: Status 404 returned error can't find the container with id bcfbc7a44994cc883bdb4e0a3a335beb51c6fb737063f97474b167f30233f28d Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.431484 4749 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" podUID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" containerName="registry" containerID="cri-o://d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164" gracePeriod=30 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.868572 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.912604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.912786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.912908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.912938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.912993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.913026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tswkr\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.913056 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.913088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates\") pod \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\" (UID: \"21e2450f-f4fe-41bd-bbc9-abcc3f03400d\") " Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.914016 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.914910 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec2edf35-170e-4586-8e6f-c563db51b6b7" containerID="2b0a6b59eec8bb98f32a0b85f481c960515163daf487a04e929e2abb05ba681b" exitCode=0 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.915286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48gh2" event={"ID":"ec2edf35-170e-4586-8e6f-c563db51b6b7","Type":"ContainerDied","Data":"2b0a6b59eec8bb98f32a0b85f481c960515163daf487a04e929e2abb05ba681b"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.915327 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48gh2" event={"ID":"ec2edf35-170e-4586-8e6f-c563db51b6b7","Type":"ContainerStarted","Data":"fc7978405c0843c01c79924bbc1f6b16aa6804954d5e4c1671b1499937920f26"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.919550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.923405 4749 generic.go:334] "Generic (PLEG): container finished" podID="b44c30b9-6b5c-40fd-9f73-0072f941ffeb" containerID="1ad90295820cbbfbe2c0b9031a187ecac5ab3aab962c24fd52ea49eaa0e246ae" exitCode=0 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.923495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzcw" event={"ID":"b44c30b9-6b5c-40fd-9f73-0072f941ffeb","Type":"ContainerDied","Data":"1ad90295820cbbfbe2c0b9031a187ecac5ab3aab962c24fd52ea49eaa0e246ae"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.923862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.926105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.927096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr" (OuterVolumeSpecName: "kube-api-access-tswkr") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "kube-api-access-tswkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928372 4749 generic.go:334] "Generic (PLEG): container finished" podID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" containerID="d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164" exitCode=0 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" event={"ID":"21e2450f-f4fe-41bd-bbc9-abcc3f03400d","Type":"ContainerDied","Data":"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928526 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hkqth" event={"ID":"21e2450f-f4fe-41bd-bbc9-abcc3f03400d","Type":"ContainerDied","Data":"6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.928707 4749 scope.go:117] "RemoveContainer" containerID="d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.930179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.932703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "21e2450f-f4fe-41bd-bbc9-abcc3f03400d" (UID: "21e2450f-f4fe-41bd-bbc9-abcc3f03400d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.943905 4749 generic.go:334] "Generic (PLEG): container finished" podID="7d47bb92-91e4-4b99-9c6a-86ec5c95396a" containerID="a62a3cd73848552d0566b5be9098f59db575300f66dd4218a878bdb1d2839f4f" exitCode=0 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.944525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgq68" event={"ID":"7d47bb92-91e4-4b99-9c6a-86ec5c95396a","Type":"ContainerDied","Data":"a62a3cd73848552d0566b5be9098f59db575300f66dd4218a878bdb1d2839f4f"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.944592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgq68" event={"ID":"7d47bb92-91e4-4b99-9c6a-86ec5c95396a","Type":"ContainerStarted","Data":"bcfbc7a44994cc883bdb4e0a3a335beb51c6fb737063f97474b167f30233f28d"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.954355 4749 generic.go:334] "Generic (PLEG): container finished" podID="ef97226b-aa25-4088-a39b-0015a132dd8c" containerID="66433485c8f0976bea2a7d95ff858c4c22ab3e3d9d2cb5c690e43a5bc21d068c" exitCode=0 Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.954407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lk96f" event={"ID":"ef97226b-aa25-4088-a39b-0015a132dd8c","Type":"ContainerDied","Data":"66433485c8f0976bea2a7d95ff858c4c22ab3e3d9d2cb5c690e43a5bc21d068c"} Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.969955 4749 scope.go:117] "RemoveContainer" containerID="d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164" Nov 29 01:17:56 crc kubenswrapper[4749]: E1129 01:17:56.970695 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164\": container with ID starting with d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164 not found: ID does not exist" containerID="d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164" Nov 29 01:17:56 crc kubenswrapper[4749]: I1129 01:17:56.970764 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164"} err="failed to get container status \"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164\": rpc error: code = NotFound desc = could not find container \"d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164\": container with ID starting with d5b6af5929c9c288e2db4a85e0409cac7a67b507cfe0361251271f9c54c9a164 not found: ID does not exist" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015141 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015177 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015225 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015241 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015260 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015275 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tswkr\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-kube-api-access-tswkr\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.015290 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/21e2450f-f4fe-41bd-bbc9-abcc3f03400d-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.256124 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"] Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.259210 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hkqth"] Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.964491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lk96f" event={"ID":"ef97226b-aa25-4088-a39b-0015a132dd8c","Type":"ContainerStarted","Data":"77fc6c84a7ba98aa33456fb9bc310ab357c6037d6f90844c6052a0b206e401db"} Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.968682 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec2edf35-170e-4586-8e6f-c563db51b6b7" containerID="6543c5534a5affa9f24d5c58d97d5d97b2a2193865390730cb1325c48b863578" exitCode=0 Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.968879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48gh2" event={"ID":"ec2edf35-170e-4586-8e6f-c563db51b6b7","Type":"ContainerDied","Data":"6543c5534a5affa9f24d5c58d97d5d97b2a2193865390730cb1325c48b863578"} Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.975688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvzcw" event={"ID":"b44c30b9-6b5c-40fd-9f73-0072f941ffeb","Type":"ContainerStarted","Data":"63cd2a5e63331cd783a7af775099b35eeaf9d1eb379929eb01d8b01572c7ed39"} Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.982561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgq68" event={"ID":"7d47bb92-91e4-4b99-9c6a-86ec5c95396a","Type":"ContainerStarted","Data":"7bfc32fac76e3bee777a7a19a372e2c3dc4d134233a45f325103d13b94d32ca6"} Nov 29 01:17:57 crc kubenswrapper[4749]: I1129 01:17:57.986157 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lk96f" podStartSLOduration=2.455886758 podStartE2EDuration="4.986131433s" podCreationTimestamp="2025-11-29 01:17:53 +0000 UTC" firstStartedPulling="2025-11-29 01:17:54.873372943 +0000 UTC m=+418.045522840" lastFinishedPulling="2025-11-29 01:17:57.403617658 +0000 UTC m=+420.575767515" observedRunningTime="2025-11-29 01:17:57.982033372 
+0000 UTC m=+421.154183249" watchObservedRunningTime="2025-11-29 01:17:57.986131433 +0000 UTC m=+421.158281290" Nov 29 01:17:58 crc kubenswrapper[4749]: I1129 01:17:58.031523 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvzcw" podStartSLOduration=3.448149478 podStartE2EDuration="6.031493641s" podCreationTimestamp="2025-11-29 01:17:52 +0000 UTC" firstStartedPulling="2025-11-29 01:17:54.886576828 +0000 UTC m=+418.058726725" lastFinishedPulling="2025-11-29 01:17:57.469921031 +0000 UTC m=+420.642070888" observedRunningTime="2025-11-29 01:17:58.030315812 +0000 UTC m=+421.202465669" watchObservedRunningTime="2025-11-29 01:17:58.031493641 +0000 UTC m=+421.203643498" Nov 29 01:17:58 crc kubenswrapper[4749]: I1129 01:17:58.991562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48gh2" event={"ID":"ec2edf35-170e-4586-8e6f-c563db51b6b7","Type":"ContainerStarted","Data":"1de99ffb032a48cc25acbe5f5b58c02608d7e8b478a72d6a4408db02b8ee4021"} Nov 29 01:17:58 crc kubenswrapper[4749]: I1129 01:17:58.997442 4749 generic.go:334] "Generic (PLEG): container finished" podID="7d47bb92-91e4-4b99-9c6a-86ec5c95396a" containerID="7bfc32fac76e3bee777a7a19a372e2c3dc4d134233a45f325103d13b94d32ca6" exitCode=0 Nov 29 01:17:59 crc kubenswrapper[4749]: I1129 01:17:58.998778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgq68" event={"ID":"7d47bb92-91e4-4b99-9c6a-86ec5c95396a","Type":"ContainerDied","Data":"7bfc32fac76e3bee777a7a19a372e2c3dc4d134233a45f325103d13b94d32ca6"} Nov 29 01:17:59 crc kubenswrapper[4749]: I1129 01:17:59.015101 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48gh2" podStartSLOduration=2.558414241 podStartE2EDuration="4.015072749s" podCreationTimestamp="2025-11-29 01:17:55 +0000 UTC" firstStartedPulling="2025-11-29 01:17:56.921857005 +0000 UTC m=+420.094006862" lastFinishedPulling="2025-11-29 01:17:58.378515513 +0000 UTC m=+421.550665370" observedRunningTime="2025-11-29 01:17:59.013662785 +0000 UTC m=+422.185812652" watchObservedRunningTime="2025-11-29 01:17:59.015072749 +0000 UTC m=+422.187222616" Nov 29 01:17:59 crc kubenswrapper[4749]: I1129 01:17:59.082459 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" path="/var/lib/kubelet/pods/21e2450f-f4fe-41bd-bbc9-abcc3f03400d/volumes" Nov 29 01:18:01 crc kubenswrapper[4749]: I1129 01:18:01.035530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgq68" event={"ID":"7d47bb92-91e4-4b99-9c6a-86ec5c95396a","Type":"ContainerStarted","Data":"cf6174edaf5914cfe415b054c9bf4681ea48bc15d1baaf0b957012132b9c4bec"} Nov 29 01:18:01 crc kubenswrapper[4749]: I1129 01:18:01.058225 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cgq68" podStartSLOduration=3.559398751 podStartE2EDuration="6.05817315s" podCreationTimestamp="2025-11-29 01:17:55 +0000 UTC" firstStartedPulling="2025-11-29 01:17:56.948892151 +0000 UTC m=+420.121042008" lastFinishedPulling="2025-11-29 01:17:59.44766655 +0000 UTC m=+422.619816407" observedRunningTime="2025-11-29 01:18:01.053245548 +0000 UTC m=+424.225395455" watchObservedRunningTime="2025-11-29 01:18:01.05817315 +0000 UTC m=+424.230323017" Nov 29 01:18:02 crc kubenswrapper[4749]: E1129 01:18:02.592807 4749 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.273432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvzcw" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.273677 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvzcw" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.328878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvzcw" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.463352 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lk96f" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.463886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lk96f" Nov 29 01:18:03 crc kubenswrapper[4749]: I1129 01:18:03.525493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lk96f" Nov 29 01:18:04 crc kubenswrapper[4749]: I1129 01:18:04.142741 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvzcw" Nov 29 01:18:04 crc kubenswrapper[4749]: I1129 01:18:04.153724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lk96f" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.697623 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.698090 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.772793 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.889893 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.889981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:18:05 crc kubenswrapper[4749]: I1129 01:18:05.957088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:18:06 crc kubenswrapper[4749]: I1129 01:18:06.120139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cgq68" Nov 29 01:18:06 crc kubenswrapper[4749]: I1129 01:18:06.121462 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48gh2" Nov 29 01:18:12 crc 
kubenswrapper[4749]: E1129 01:18:12.722654 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:22 crc kubenswrapper[4749]: E1129 01:18:22.904922 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:33 crc kubenswrapper[4749]: E1129 01:18:33.077685 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:43 crc kubenswrapper[4749]: E1129 01:18:43.211333 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:53 crc kubenswrapper[4749]: E1129 01:18:53.334124 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice/crio-6a92157d38d4149758a83825adfc72a0f6d69691a7147db54e67b703bd36038d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e2450f_f4fe_41bd_bbc9_abcc3f03400d.slice\": RecentStats: unable to find data in memory cache]" Nov 29 01:18:57 crc kubenswrapper[4749]: E1129 01:18:57.115147 4749 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/996caaf64bc6ef90c91b893cf54bd6817d44b40705ae889fa9d2044a20a35db4/diff" to get inode usage: stat /var/lib/containers/storage/overlay/996caaf64bc6ef90c91b893cf54bd6817d44b40705ae889fa9d2044a20a35db4/diff: no such file or directory, extraDiskErr: Nov 29 01:19:55 crc kubenswrapper[4749]: I1129 01:19:55.374889 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:19:55 crc kubenswrapper[4749]: I1129 01:19:55.376231 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:20:25 crc kubenswrapper[4749]: I1129 01:20:25.374181 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:20:25 crc kubenswrapper[4749]: I1129 01:20:25.375282 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:20:55 crc kubenswrapper[4749]: I1129 01:20:55.374091 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:20:55 crc kubenswrapper[4749]: I1129 01:20:55.375153 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:20:55 crc kubenswrapper[4749]: I1129 01:20:55.375258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:20:55 crc kubenswrapper[4749]: I1129 01:20:55.376237 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:20:55 crc kubenswrapper[4749]: I1129 01:20:55.376315 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147" gracePeriod=600 Nov 29 01:20:56 crc kubenswrapper[4749]: I1129 01:20:56.150145 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147" exitCode=0 Nov 29 01:20:56 crc kubenswrapper[4749]: I1129 01:20:56.150254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147"} Nov 29 01:20:56 crc kubenswrapper[4749]: I1129 01:20:56.150984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581"} Nov 29 01:20:56 crc kubenswrapper[4749]: I1129 01:20:56.151022 4749 scope.go:117] "RemoveContainer" containerID="12a23a1e26ba2ea152b73c69f4cde029afff51ef605b21a0ff3648730e7a0a26" Nov 29 01:22:55 crc kubenswrapper[4749]: I1129 01:22:55.374787 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:22:55 crc kubenswrapper[4749]: I1129 01:22:55.375718 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:23:25 crc kubenswrapper[4749]: I1129 01:23:25.374930 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:23:25 crc kubenswrapper[4749]: I1129 01:23:25.376008 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:23:32 crc kubenswrapper[4749]: I1129 01:23:32.877671 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.799738 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w92sr"] Nov 29 01:23:47 crc kubenswrapper[4749]: E1129 01:23:47.801753 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" containerName="registry" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.801783 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" containerName="registry" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.802049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e2450f-f4fe-41bd-bbc9-abcc3f03400d" containerName="registry" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.804186 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.811329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w92sr"] Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.995314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.995403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nt6\" (UniqueName: \"kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:47 crc kubenswrapper[4749]: I1129 01:23:47.995502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.097271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.097373 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nt6\" (UniqueName: \"kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.097481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.098455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.098690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.134575 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d7nt6\" (UniqueName: \"kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6\") pod \"certified-operators-w92sr\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") " pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.431291 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:48 crc kubenswrapper[4749]: I1129 01:23:48.768382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w92sr"] Nov 29 01:23:49 crc kubenswrapper[4749]: I1129 01:23:49.780006 4749 generic.go:334] "Generic (PLEG): container finished" podID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerID="69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b" exitCode=0 Nov 29 01:23:49 crc kubenswrapper[4749]: I1129 01:23:49.780125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerDied","Data":"69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b"} Nov 29 01:23:49 crc kubenswrapper[4749]: I1129 01:23:49.781551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerStarted","Data":"8cd722a8455a8083ca25c351a9b4a2f282fb5690006ae880fc6e625bc078a2f4"} Nov 29 01:23:49 crc kubenswrapper[4749]: I1129 01:23:49.786682 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 01:23:51 crc kubenswrapper[4749]: I1129 01:23:51.800856 4749 generic.go:334] "Generic (PLEG): container finished" podID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerID="c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802" exitCode=0 Nov 29 01:23:51 crc kubenswrapper[4749]: I1129 01:23:51.801335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerDied","Data":"c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802"} Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.360361 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"] Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.362637 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.362637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.391328 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"]
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.470960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn8dj\" (UniqueName: \"kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.471081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.471139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.572354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.572442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.572533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn8dj\" (UniqueName: \"kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.573221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.573340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.594700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn8dj\" (UniqueName: \"kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj\") pod \"redhat-marketplace-gws5t\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.702566 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.819551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerStarted","Data":"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"}
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.895858 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w92sr" podStartSLOduration=3.368537444 podStartE2EDuration="5.895839445s" podCreationTimestamp="2025-11-29 01:23:47 +0000 UTC" firstStartedPulling="2025-11-29 01:23:49.786049311 +0000 UTC m=+772.958199208" lastFinishedPulling="2025-11-29 01:23:52.313351312 +0000 UTC m=+775.485501209" observedRunningTime="2025-11-29 01:23:52.89061714 +0000 UTC m=+776.062766987" watchObservedRunningTime="2025-11-29 01:23:52.895839445 +0000 UTC m=+776.067989312"
Nov 29 01:23:52 crc kubenswrapper[4749]: I1129 01:23:52.991149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"]
Nov 29 01:23:53 crc kubenswrapper[4749]: W1129 01:23:53.000108 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8a1436_4a73_4fa6_8c63_83e21c68896f.slice/crio-36d23f7bd3ef9e2defc02509b98fcba06d38ec6ccd2ce24c5cda295617918615 WatchSource:0}: Error finding container 36d23f7bd3ef9e2defc02509b98fcba06d38ec6ccd2ce24c5cda295617918615: Status 404 returned error can't find the container with id 36d23f7bd3ef9e2defc02509b98fcba06d38ec6ccd2ce24c5cda295617918615
Nov 29 01:23:53 crc kubenswrapper[4749]: I1129 01:23:53.836168 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerID="a12ac2ab5fbf906258ccf99929397fad58361eff1a9458e9853f062dde05d70c" exitCode=0
Nov 29 01:23:53 crc kubenswrapper[4749]: I1129 01:23:53.836323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerDied","Data":"a12ac2ab5fbf906258ccf99929397fad58361eff1a9458e9853f062dde05d70c"}
Nov 29 01:23:53 crc kubenswrapper[4749]: I1129 01:23:53.836753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerStarted","Data":"36d23f7bd3ef9e2defc02509b98fcba06d38ec6ccd2ce24c5cda295617918615"}
Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.375313 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.375787 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
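The pod_startup_latency_tracker.go:104 record above for certified-operators-w92sr reports both podStartE2EDuration (observed running time minus creation, 5.895839445s) and podStartSLOduration, and the numbers are consistent with the SLO figure excluding the image-pull window: 5.895839445s minus (01:23:52.313351312 - 01:23:49.786049311 = 2.527302001s) is 3.368537444s, exactly the logged value. The arithmetic, as a small runnable check (timestamps copied from the log; the parsing layout is an assumption about the printed format):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        parse := func(s string) time.Time {
            t, _ := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
            return t
        }
        created := parse("2025-11-29 01:23:47 +0000 UTC")
        firstPull := parse("2025-11-29 01:23:49.786049311 +0000 UTC")
        lastPull := parse("2025-11-29 01:23:52.313351312 +0000 UTC")
        running := parse("2025-11-29 01:23:52.895839445 +0000 UTC")

        e2e := running.Sub(created)          // 5.895839445s end-to-end
        slo := e2e - lastPull.Sub(firstPull) // minus the image-pull window: 3.368537444s
        fmt.Println("podStartE2EDuration:", e2e, "podStartSLOduration:", slo)
    }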
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.375886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.379182 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.379349 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581" gracePeriod=600 Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.852662 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerID="6094afbe6d22fd1aa2fd53f5b738f82b724ae325e8a96017f295677eea23c7c4" exitCode=0 Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.852747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerDied","Data":"6094afbe6d22fd1aa2fd53f5b738f82b724ae325e8a96017f295677eea23c7c4"} Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.855864 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581" exitCode=0 Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.855886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581"} Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.855904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2"} Nov 29 01:23:55 crc kubenswrapper[4749]: I1129 01:23:55.855922 4749 scope.go:117] "RemoveContainer" containerID="935d815a0d55fee720bd751cb0ba28587e2781ad183f1c9c6a5f793ff1bc7147" Nov 29 01:23:56 crc kubenswrapper[4749]: I1129 01:23:56.872776 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerStarted","Data":"13722be24a1de9f60c1776074ba6e3dabc398eeb37ddb2a4bb5effc369771834"} Nov 29 01:23:56 crc kubenswrapper[4749]: I1129 01:23:56.904030 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gws5t" podStartSLOduration=2.378785965 
podStartE2EDuration="4.903962399s" podCreationTimestamp="2025-11-29 01:23:52 +0000 UTC" firstStartedPulling="2025-11-29 01:23:53.839335693 +0000 UTC m=+777.011485570" lastFinishedPulling="2025-11-29 01:23:56.364512147 +0000 UTC m=+779.536662004" observedRunningTime="2025-11-29 01:23:56.898450665 +0000 UTC m=+780.070600562" watchObservedRunningTime="2025-11-29 01:23:56.903962399 +0000 UTC m=+780.076112286" Nov 29 01:23:58 crc kubenswrapper[4749]: I1129 01:23:58.432289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:58 crc kubenswrapper[4749]: I1129 01:23:58.432763 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:58 crc kubenswrapper[4749]: I1129 01:23:58.499859 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:58 crc kubenswrapper[4749]: I1129 01:23:58.939484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w92sr" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.558711 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"] Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.562474 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.594991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.595595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.595877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcs8x\" (UniqueName: \"kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.611950 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"] Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.696681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.697092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.697247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcs8x\" (UniqueName: \"kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.697997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.698018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.737411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcs8x\" (UniqueName: \"kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x\") pod \"redhat-operators-xqg5b\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") " pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:23:59 crc kubenswrapper[4749]: I1129 01:23:59.908820 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:24:00 crc kubenswrapper[4749]: I1129 01:24:00.404436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"]
Nov 29 01:24:00 crc kubenswrapper[4749]: W1129 01:24:00.414238 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dd237ae_d590_4a0d_81a7_afec1fb19bdb.slice/crio-9a5292b2e5fce6423f0b3c7880891f171778a0ac1c250d6b9809357e553cfb08 WatchSource:0}: Error finding container 9a5292b2e5fce6423f0b3c7880891f171778a0ac1c250d6b9809357e553cfb08: Status 404 returned error can't find the container with id 9a5292b2e5fce6423f0b3c7880891f171778a0ac1c250d6b9809357e553cfb08
Nov 29 01:24:00 crc kubenswrapper[4749]: I1129 01:24:00.908628 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerID="4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf" exitCode=0
Nov 29 01:24:00 crc kubenswrapper[4749]: I1129 01:24:00.908738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerDied","Data":"4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf"}
Nov 29 01:24:00 crc kubenswrapper[4749]: I1129 01:24:00.909223 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerStarted","Data":"9a5292b2e5fce6423f0b3c7880891f171778a0ac1c250d6b9809357e553cfb08"}
Nov 29 01:24:01 crc kubenswrapper[4749]: I1129 01:24:01.346084 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w92sr"]
Nov 29 01:24:01 crc kubenswrapper[4749]: I1129 01:24:01.346494 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w92sr" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="registry-server" containerID="cri-o://d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81" gracePeriod=2
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.241596 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w92sr"
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.437908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities\") pod \"76d3ce82-58dd-45e4-aaed-d9375266ad40\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") "
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.438034 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7nt6\" (UniqueName: \"kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6\") pod \"76d3ce82-58dd-45e4-aaed-d9375266ad40\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") "
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.438236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content\") pod \"76d3ce82-58dd-45e4-aaed-d9375266ad40\" (UID: \"76d3ce82-58dd-45e4-aaed-d9375266ad40\") "
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.440107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities" (OuterVolumeSpecName: "utilities") pod "76d3ce82-58dd-45e4-aaed-d9375266ad40" (UID: "76d3ce82-58dd-45e4-aaed-d9375266ad40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.449403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6" (OuterVolumeSpecName: "kube-api-access-d7nt6") pod "76d3ce82-58dd-45e4-aaed-d9375266ad40" (UID: "76d3ce82-58dd-45e4-aaed-d9375266ad40"). InnerVolumeSpecName "kube-api-access-d7nt6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.524969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76d3ce82-58dd-45e4-aaed-d9375266ad40" (UID: "76d3ce82-58dd-45e4-aaed-d9375266ad40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
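Teardown mirrors setup: once the pod is deleted, every volume still in the actual state gets an UnmountVolume operation, the plugin's TearDown runs, and only then is the volume reported detached from node crc. The reverse loop, as a toy sketch (output mimics the log; not the kubelet's actual types):

    package main

    import "fmt"

    // Toy teardown: unmount everything still mounted for a pod that is
    // no longer in the desired state, then mark each volume detached.
    func teardown(podUID string, mounted []string) {
        for _, v := range mounted {
            fmt.Printf("UnmountVolume started for volume %q pod %q\n", v, podUID)
            // ... the plugin's TearDown() would run here ...
            fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v)
            fmt.Printf("Volume detached for volume %q\n", v)
        }
    }

    func main() {
        teardown("76d3ce82-58dd-45e4-aaed-d9375266ad40",
            []string{"utilities", "kube-api-access-d7nt6", "catalog-content"})
    }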
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.540128 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.540182 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76d3ce82-58dd-45e4-aaed-d9375266ad40-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.540215 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7nt6\" (UniqueName: \"kubernetes.io/projected/76d3ce82-58dd-45e4-aaed-d9375266ad40-kube-api-access-d7nt6\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.703411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gws5t" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.703524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gws5t" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.785853 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gws5t" Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.931764 4749 generic.go:334] "Generic (PLEG): container finished" podID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerID="d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81" exitCode=0 Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.931883 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.931880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerDied","Data":"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"}
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.931988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w92sr" event={"ID":"76d3ce82-58dd-45e4-aaed-d9375266ad40","Type":"ContainerDied","Data":"8cd722a8455a8083ca25c351a9b4a2f282fb5690006ae880fc6e625bc078a2f4"}
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.932024 4749 scope.go:117] "RemoveContainer" containerID="d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.935562 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerID="c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48" exitCode=0
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.935611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerDied","Data":"c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48"}
Nov 29 01:24:02 crc kubenswrapper[4749]: I1129 01:24:02.980952 4749 scope.go:117] "RemoveContainer" containerID="c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.000007 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w92sr"]
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.005484 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w92sr"]
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.016123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gws5t"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.019618 4749 scope.go:117] "RemoveContainer" containerID="69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.066121 4749 scope.go:117] "RemoveContainer" containerID="d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"
Nov 29 01:24:03 crc kubenswrapper[4749]: E1129 01:24:03.067025 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81\": container with ID starting with d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81 not found: ID does not exist" containerID="d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.067089 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81"} err="failed to get container status \"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81\": rpc error: code = NotFound desc = could not find container \"d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81\": container with ID starting with d9a31ac8e8abb7fa9160aa56fab40cd9448f21cc9f8f0f6dd583823bbff1de81 not found: ID does not exist"
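The RemoveContainer / "ContainerStatus from runtime service failed" ... NotFound pairs above are harmless: the kubelet re-deletes containers that CRI-O already removed together with the sandbox, and a NotFound answer just means the work is already done. Sketch of that idempotent-delete pattern (the sentinel error and helper are illustrative):

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("container not found")

    // removeContainer tolerates NotFound: a container that is already
    // gone counts as successfully removed, so retries are harmless.
    func removeContainer(id string, runtimeRemove func(string) error) error {
        if err := runtimeRemove(id); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("container %s already gone, ignoring\n", id)
                return nil
            }
            return err
        }
        return nil
    }

    func main() {
        gone := func(string) error { return errNotFound }
        fmt.Println(removeContainer("d9a31ac8e8ab", gone)) // <nil>
    }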
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.067131 4749 scope.go:117] "RemoveContainer" containerID="c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802"
Nov 29 01:24:03 crc kubenswrapper[4749]: E1129 01:24:03.067781 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802\": container with ID starting with c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802 not found: ID does not exist" containerID="c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.067822 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802"} err="failed to get container status \"c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802\": rpc error: code = NotFound desc = could not find container \"c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802\": container with ID starting with c498faa93c6654a84ea9b39774ff70c91e2464830723b8b70bf8c285ac3a7802 not found: ID does not exist"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.067848 4749 scope.go:117] "RemoveContainer" containerID="69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b"
Nov 29 01:24:03 crc kubenswrapper[4749]: E1129 01:24:03.068503 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b\": container with ID starting with 69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b not found: ID does not exist" containerID="69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.068540 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b"} err="failed to get container status \"69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b\": rpc error: code = NotFound desc = could not find container \"69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b\": container with ID starting with 69259093cc2ee5db7046f9775770861026a1159d488e8e5f10f1808b81e9da9b not found: ID does not exist"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.088884 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" path="/var/lib/kubelet/pods/76d3ce82-58dd-45e4-aaed-d9375266ad40/volumes"
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.950770 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerStarted","Data":"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867"}
Nov 29 01:24:03 crc kubenswrapper[4749]: I1129 01:24:03.989879 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xqg5b" podStartSLOduration=2.490891753 podStartE2EDuration="4.989842607s" podCreationTimestamp="2025-11-29 01:23:59 +0000 UTC" firstStartedPulling="2025-11-29 01:24:00.911086742 +0000 UTC m=+784.083236619" lastFinishedPulling="2025-11-29 01:24:03.410037576 +0000 UTC m=+786.582187473" observedRunningTime="2025-11-29 01:24:03.981847663 +0000 UTC m=+787.153997560" watchObservedRunningTime="2025-11-29 01:24:03.989842607 +0000 UTC m=+787.161992494"
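kubelet_volumes.go:163 then removes /var/lib/kubelet/pods/<uid>/volumes for the deleted pod once nothing is mounted there any more. A sketch of such an orphan scan (path layout from the log; the active-pod lookup and dry-run printing are illustrative):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // cleanupOrphans reports the volumes dir of any pod UID on disk that
    // is no longer in the set of active pods (dry run: print, don't remove).
    func cleanupOrphans(root string, active map[string]bool) error {
        entries, err := os.ReadDir(root)
        if err != nil {
            return err
        }
        for _, e := range entries {
            if !e.IsDir() || active[e.Name()] {
                continue // not a pod dir, or the pod is still running
            }
            dir := filepath.Join(root, e.Name(), "volumes")
            fmt.Println("would clean up orphaned pod volumes dir:", dir)
        }
        return nil
    }

    func main() {
        _ = cleanupOrphans("/var/lib/kubelet/pods", map[string]bool{
            "800b3936-ba93-47d8-9417-2fdc5ce4d171": true, // still running
        })
    }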
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.753996 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"]
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.754520 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gws5t" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="registry-server" containerID="cri-o://13722be24a1de9f60c1776074ba6e3dabc398eeb37ddb2a4bb5effc369771834" gracePeriod=2
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.964162 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vk2wd"]
Nov 29 01:24:05 crc kubenswrapper[4749]: E1129 01:24:05.964560 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="extract-utilities"
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.964592 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="extract-utilities"
Nov 29 01:24:05 crc kubenswrapper[4749]: E1129 01:24:05.964606 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="extract-content"
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.964614 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="extract-content"
Nov 29 01:24:05 crc kubenswrapper[4749]: E1129 01:24:05.964637 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="registry-server"
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.964648 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="registry-server"
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.964819 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d3ce82-58dd-45e4-aaed-d9375266ad40" containerName="registry-server"
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.965978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk2wd"
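Before admitting community-operators-vk2wd, the CPU and memory managers prune per-container accounting left behind by the certified-operators pod that just went away (RemoveStaleState, Deleted CPUSet assignment), so stale assignments never leak into the new pod's admission. Sketch of pruning a state map against the set of live containers (toy structures, not the managers' real state files):

    package main

    import "fmt"

    // pruneStale drops resource-manager entries whose (podUID, container)
    // key no longer corresponds to a live container.
    func pruneStale(state map[string]string, alive map[string]bool) {
        for key := range state {
            if !alive[key] {
                fmt.Println("RemoveStaleState: removing container", key)
                delete(state, key)
            }
        }
    }

    func main() {
        state := map[string]string{
            "76d3ce82/registry-server":        "cpuset=0-3",
            "800b3936/machine-config-daemon":  "cpuset=0-3",
        }
        pruneStale(state, map[string]bool{"800b3936/machine-config-daemon": true})
        fmt.Println(state) // only the live container's assignment remains
    }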
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.979931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk2wd"]
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.990222 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerID="13722be24a1de9f60c1776074ba6e3dabc398eeb37ddb2a4bb5effc369771834" exitCode=0
Nov 29 01:24:05 crc kubenswrapper[4749]: I1129 01:24:05.990291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerDied","Data":"13722be24a1de9f60c1776074ba6e3dabc398eeb37ddb2a4bb5effc369771834"}
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.159169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn2w\" (UniqueName: \"kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.159839 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.159883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.262558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn2w\" (UniqueName: \"kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.262638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.262673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.263322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd"
(UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.264317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.272214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gws5t" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.292637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn2w\" (UniqueName: \"kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w\") pod \"community-operators-vk2wd\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") " pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.464396 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities\") pod \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.464496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn8dj\" (UniqueName: \"kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj\") pod \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.464549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content\") pod \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\" (UID: \"1e8a1436-4a73-4fa6-8c63-83e21c68896f\") " Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.466576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities" (OuterVolumeSpecName: "utilities") pod "1e8a1436-4a73-4fa6-8c63-83e21c68896f" (UID: "1e8a1436-4a73-4fa6-8c63-83e21c68896f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.480377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj" (OuterVolumeSpecName: "kube-api-access-tn8dj") pod "1e8a1436-4a73-4fa6-8c63-83e21c68896f" (UID: "1e8a1436-4a73-4fa6-8c63-83e21c68896f"). InnerVolumeSpecName "kube-api-access-tn8dj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.489764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e8a1436-4a73-4fa6-8c63-83e21c68896f" (UID: "1e8a1436-4a73-4fa6-8c63-83e21c68896f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.566460 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.566522 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn8dj\" (UniqueName: \"kubernetes.io/projected/1e8a1436-4a73-4fa6-8c63-83e21c68896f-kube-api-access-tn8dj\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.566538 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e8a1436-4a73-4fa6-8c63-83e21c68896f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.590891 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:06 crc kubenswrapper[4749]: I1129 01:24:06.803653 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vk2wd"] Nov 29 01:24:06 crc kubenswrapper[4749]: W1129 01:24:06.810588 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c8eac4_13f0_4f9e_a728_10480acdf927.slice/crio-27773cb77c947fc6e38ff34e95f80fd2048ea474e1bee0870ac1f3717a099121 WatchSource:0}: Error finding container 27773cb77c947fc6e38ff34e95f80fd2048ea474e1bee0870ac1f3717a099121: Status 404 returned error can't find the container with id 27773cb77c947fc6e38ff34e95f80fd2048ea474e1bee0870ac1f3717a099121 Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.008360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gws5t" event={"ID":"1e8a1436-4a73-4fa6-8c63-83e21c68896f","Type":"ContainerDied","Data":"36d23f7bd3ef9e2defc02509b98fcba06d38ec6ccd2ce24c5cda295617918615"} Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.008410 4749 scope.go:117] "RemoveContainer" containerID="13722be24a1de9f60c1776074ba6e3dabc398eeb37ddb2a4bb5effc369771834" Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.008475 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.010377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerStarted","Data":"67cd7ed94cc3265a49a7ccb5fbd54ac4e5302808bd5f4bc663487c97cad619a2"}
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.010422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerStarted","Data":"27773cb77c947fc6e38ff34e95f80fd2048ea474e1bee0870ac1f3717a099121"}
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.053753 4749 scope.go:117] "RemoveContainer" containerID="6094afbe6d22fd1aa2fd53f5b738f82b724ae325e8a96017f295677eea23c7c4"
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.055825 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"]
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.061349 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gws5t"]
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.093760 4749 scope.go:117] "RemoveContainer" containerID="a12ac2ab5fbf906258ccf99929397fad58361eff1a9458e9853f062dde05d70c"
Nov 29 01:24:07 crc kubenswrapper[4749]: I1129 01:24:07.096713 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" path="/var/lib/kubelet/pods/1e8a1436-4a73-4fa6-8c63-83e21c68896f/volumes"
Nov 29 01:24:08 crc kubenswrapper[4749]: I1129 01:24:08.017346 4749 generic.go:334] "Generic (PLEG): container finished" podID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerID="67cd7ed94cc3265a49a7ccb5fbd54ac4e5302808bd5f4bc663487c97cad619a2" exitCode=0
Nov 29 01:24:08 crc kubenswrapper[4749]: I1129 01:24:08.017436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerDied","Data":"67cd7ed94cc3265a49a7ccb5fbd54ac4e5302808bd5f4bc663487c97cad619a2"}
Nov 29 01:24:09 crc kubenswrapper[4749]: I1129 01:24:09.032790 4749 generic.go:334] "Generic (PLEG): container finished" podID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerID="a2ef377d8d09cf519dc80e8e409ae5b4c3f17b427f12724ae361ef67812f2d76" exitCode=0
Nov 29 01:24:09 crc kubenswrapper[4749]: I1129 01:24:09.032871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerDied","Data":"a2ef377d8d09cf519dc80e8e409ae5b4c3f17b427f12724ae361ef67812f2d76"}
Nov 29 01:24:09 crc kubenswrapper[4749]: I1129 01:24:09.909766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xqg5b"
Nov 29 01:24:09 crc kubenswrapper[4749]: I1129 01:24:09.909858 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xqg5b"
Nov 29 01:24:10 crc kubenswrapper[4749]: I1129 01:24:10.969770 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xqg5b" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="registry-server" probeResult="failure" output=<
Nov 29 01:24:10 crc kubenswrapper[4749]: 	timeout: failed to connect service ":50051" within 1s
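The failed startup probe on redhat-operators-xqg5b (timeout: failed to connect service ":50051" within 1s) is the registry's gRPC health check: the catalog is still being extracted, so nothing accepts on :50051 inside the 1s budget yet. At the transport level this reduces to a dial with a deadline (sketch only; the real probe goes on to issue a grpc.health.v1 Check call):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Same failure mode as the log: nothing is serving on :50051 yet,
        // so the connection cannot be established within the 1s probe budget.
        conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
        if err != nil {
            fmt.Println(`timeout: failed to connect service ":50051" within 1s`)
            return
        }
        conn.Close()
        fmt.Println("service is accepting connections")
    }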
service ":50051" within 1s Nov 29 01:24:10 crc kubenswrapper[4749]: > Nov 29 01:24:11 crc kubenswrapper[4749]: I1129 01:24:11.054947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerStarted","Data":"5157691fab772616a706cb80c50e035add332cecf110f49002d203df5d2b2bfe"} Nov 29 01:24:11 crc kubenswrapper[4749]: I1129 01:24:11.080427 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vk2wd" podStartSLOduration=2.829833034 podStartE2EDuration="6.080404948s" podCreationTimestamp="2025-11-29 01:24:05 +0000 UTC" firstStartedPulling="2025-11-29 01:24:07.013555585 +0000 UTC m=+790.185705452" lastFinishedPulling="2025-11-29 01:24:10.264127509 +0000 UTC m=+793.436277366" observedRunningTime="2025-11-29 01:24:11.077095778 +0000 UTC m=+794.249245635" watchObservedRunningTime="2025-11-29 01:24:11.080404948 +0000 UTC m=+794.252554815" Nov 29 01:24:16 crc kubenswrapper[4749]: I1129 01:24:16.592049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:16 crc kubenswrapper[4749]: I1129 01:24:16.593139 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:16 crc kubenswrapper[4749]: I1129 01:24:16.674745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:17 crc kubenswrapper[4749]: I1129 01:24:17.156732 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vk2wd" Nov 29 01:24:17 crc kubenswrapper[4749]: I1129 01:24:17.225717 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk2wd"] Nov 29 01:24:19 crc kubenswrapper[4749]: I1129 01:24:19.118250 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vk2wd" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="registry-server" containerID="cri-o://5157691fab772616a706cb80c50e035add332cecf110f49002d203df5d2b2bfe" gracePeriod=2 Nov 29 01:24:19 crc kubenswrapper[4749]: I1129 01:24:19.999435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.087354 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.130995 4749 generic.go:334] "Generic (PLEG): container finished" podID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerID="5157691fab772616a706cb80c50e035add332cecf110f49002d203df5d2b2bfe" exitCode=0 Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.131092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerDied","Data":"5157691fab772616a706cb80c50e035add332cecf110f49002d203df5d2b2bfe"} Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.173495 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.294991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities\") pod \"c5c8eac4-13f0-4f9e-a728-10480acdf927\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") "
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.295190 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjn2w\" (UniqueName: \"kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w\") pod \"c5c8eac4-13f0-4f9e-a728-10480acdf927\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") "
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.295255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content\") pod \"c5c8eac4-13f0-4f9e-a728-10480acdf927\" (UID: \"c5c8eac4-13f0-4f9e-a728-10480acdf927\") "
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.296578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities" (OuterVolumeSpecName: "utilities") pod "c5c8eac4-13f0-4f9e-a728-10480acdf927" (UID: "c5c8eac4-13f0-4f9e-a728-10480acdf927"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.304065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w" (OuterVolumeSpecName: "kube-api-access-gjn2w") pod "c5c8eac4-13f0-4f9e-a728-10480acdf927" (UID: "c5c8eac4-13f0-4f9e-a728-10480acdf927"). InnerVolumeSpecName "kube-api-access-gjn2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.362703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5c8eac4-13f0-4f9e-a728-10480acdf927" (UID: "c5c8eac4-13f0-4f9e-a728-10480acdf927"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.398388 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjn2w\" (UniqueName: \"kubernetes.io/projected/c5c8eac4-13f0-4f9e-a728-10480acdf927-kube-api-access-gjn2w\") on node \"crc\" DevicePath \"\""
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.398441 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:24:20 crc kubenswrapper[4749]: I1129 01:24:20.398464 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5c8eac4-13f0-4f9e-a728-10480acdf927-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.145721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vk2wd" event={"ID":"c5c8eac4-13f0-4f9e-a728-10480acdf927","Type":"ContainerDied","Data":"27773cb77c947fc6e38ff34e95f80fd2048ea474e1bee0870ac1f3717a099121"}
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.145820 4749 scope.go:117] "RemoveContainer" containerID="5157691fab772616a706cb80c50e035add332cecf110f49002d203df5d2b2bfe"
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.145925 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vk2wd"
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.181592 4749 scope.go:117] "RemoveContainer" containerID="a2ef377d8d09cf519dc80e8e409ae5b4c3f17b427f12724ae361ef67812f2d76"
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.188728 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vk2wd"]
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.197322 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vk2wd"]
Nov 29 01:24:21 crc kubenswrapper[4749]: I1129 01:24:21.205815 4749 scope.go:117] "RemoveContainer" containerID="67cd7ed94cc3265a49a7ccb5fbd54ac4e5302808bd5f4bc663487c97cad619a2"
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.330012 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"]
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.331229 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xqg5b" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="registry-server" containerID="cri-o://777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867" gracePeriod=2
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.832554 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqg5b"
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.945300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content\") pod \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") "
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.945429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcs8x\" (UniqueName: \"kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x\") pod \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") "
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.945488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities\") pod \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\" (UID: \"7dd237ae-d590-4a0d-81a7-afec1fb19bdb\") "
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.946871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities" (OuterVolumeSpecName: "utilities") pod "7dd237ae-d590-4a0d-81a7-afec1fb19bdb" (UID: "7dd237ae-d590-4a0d-81a7-afec1fb19bdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:24:22 crc kubenswrapper[4749]: I1129 01:24:22.953834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x" (OuterVolumeSpecName: "kube-api-access-gcs8x") pod "7dd237ae-d590-4a0d-81a7-afec1fb19bdb" (UID: "7dd237ae-d590-4a0d-81a7-afec1fb19bdb"). InnerVolumeSpecName "kube-api-access-gcs8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.047547 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcs8x\" (UniqueName: \"kubernetes.io/projected/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-kube-api-access-gcs8x\") on node \"crc\" DevicePath \"\""
Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.048071 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.059767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dd237ae-d590-4a0d-81a7-afec1fb19bdb" (UID: "7dd237ae-d590-4a0d-81a7-afec1fb19bdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.085574 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" path="/var/lib/kubelet/pods/c5c8eac4-13f0-4f9e-a728-10480acdf927/volumes" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.149425 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd237ae-d590-4a0d-81a7-afec1fb19bdb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.169685 4749 generic.go:334] "Generic (PLEG): container finished" podID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerID="777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867" exitCode=0 Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.169764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerDied","Data":"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867"} Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.169818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqg5b" event={"ID":"7dd237ae-d590-4a0d-81a7-afec1fb19bdb","Type":"ContainerDied","Data":"9a5292b2e5fce6423f0b3c7880891f171778a0ac1c250d6b9809357e553cfb08"} Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.169858 4749 scope.go:117] "RemoveContainer" containerID="777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.169882 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqg5b" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.204719 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"] Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.215520 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xqg5b"] Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.220394 4749 scope.go:117] "RemoveContainer" containerID="c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.264574 4749 scope.go:117] "RemoveContainer" containerID="4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.298267 4749 scope.go:117] "RemoveContainer" containerID="777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867" Nov 29 01:24:23 crc kubenswrapper[4749]: E1129 01:24:23.299377 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867\": container with ID starting with 777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867 not found: ID does not exist" containerID="777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.299479 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867"} err="failed to get container status \"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867\": rpc error: code = NotFound desc 
= could not find container \"777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867\": container with ID starting with 777efe9895e1c46a23255880aad07d3b81a182b3d8ef6f0ce74482c18aaf6867 not found: ID does not exist" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.299565 4749 scope.go:117] "RemoveContainer" containerID="c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48" Nov 29 01:24:23 crc kubenswrapper[4749]: E1129 01:24:23.300290 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48\": container with ID starting with c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48 not found: ID does not exist" containerID="c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.300369 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48"} err="failed to get container status \"c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48\": rpc error: code = NotFound desc = could not find container \"c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48\": container with ID starting with c353cce91c5037ef814208ab4bbbd580b5e8ed137743bc336b8f194d9a4b1a48 not found: ID does not exist" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.300426 4749 scope.go:117] "RemoveContainer" containerID="4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf" Nov 29 01:24:23 crc kubenswrapper[4749]: E1129 01:24:23.300921 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf\": container with ID starting with 4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf not found: ID does not exist" containerID="4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf" Nov 29 01:24:23 crc kubenswrapper[4749]: I1129 01:24:23.300953 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf"} err="failed to get container status \"4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf\": rpc error: code = NotFound desc = could not find container \"4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf\": container with ID starting with 4b5d6f3158618d72510c0b5e3881959c84a3f0a57b4c6dddd712ee7161e63abf not found: ID does not exist" Nov 29 01:24:25 crc kubenswrapper[4749]: I1129 01:24:25.091729 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" path="/var/lib/kubelet/pods/7dd237ae-d590-4a0d-81a7-afec1fb19bdb/volumes" Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.835396 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m7sg4"] Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.837428 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-controller" containerID="cri-o://218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838772 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-acl-logging" containerID="cri-o://d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838638 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="sbdb" containerID="cri-o://f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838713 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="northd" containerID="cri-o://d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838731 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838754 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-node" containerID="cri-o://f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.838666 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="nbdb" containerID="cri-o://70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" gracePeriod=30 Nov 29 01:24:42 crc kubenswrapper[4749]: I1129 01:24:42.902051 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" containerID="cri-o://7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" gracePeriod=30 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.221692 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/3.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.224323 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovn-acl-logging/0.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.225315 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovn-controller/0.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.225813 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289168 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nh9m6"] Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289424 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289443 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289457 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-node" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289467 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-node" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289492 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289510 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289521 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kubecfg-setup" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289528 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kubecfg-setup" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289539 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289550 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289560 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289567 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289575 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289584 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289597 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289607 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289629 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289638 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="northd" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289646 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="northd" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289659 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289677 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="nbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289685 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="nbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289696 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-acl-logging" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289704 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-acl-logging" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289722 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289743 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="extract-utilities" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289754 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289762 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289770 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="sbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289777 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="sbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289789 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289798 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289807 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289815 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="extract-content" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.289823 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289831 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289946 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="nbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289958 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289966 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-acl-logging" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289977 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovn-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289986 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-node" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.289998 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="sbdb" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290006 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c8eac4-13f0-4f9e-a728-10480acdf927" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290012 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8a1436-4a73-4fa6-8c63-83e21c68896f" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290020 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290029 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: 
I1129 01:24:43.290035 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="northd" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290043 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290052 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290063 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd237ae-d590-4a0d-81a7-afec1fb19bdb" containerName="registry-server" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.290212 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290221 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.290322 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerName="ovnkube-controller" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.291941 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.358174 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovnkube-controller/3.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.361258 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovn-acl-logging/0.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.362274 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m7sg4_52d1a95a-c900-4842-82c4-5f4c37a16fee/ovn-controller/0.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.362902 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363165 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363053 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" 
containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363373 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363495 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363635 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363777 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" exitCode=0 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363873 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" exitCode=143 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363948 4749 generic.go:334] "Generic (PLEG): container finished" podID="52d1a95a-c900-4842-82c4-5f4c37a16fee" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" exitCode=143 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.363403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.362984 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364450 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364463 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364479 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364487 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364493 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364498 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364503 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364508 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364527 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364532 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" 
event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364549 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364555 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364561 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364566 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364571 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364578 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364584 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364604 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364611 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364617 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364635 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364644 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364652 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364658 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364664 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364686 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364691 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364698 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364705 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364710 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m7sg4" event={"ID":"52d1a95a-c900-4842-82c4-5f4c37a16fee","Type":"ContainerDied","Data":"38e389a370f7a3bd521b153442d9ad90da8e6250ad222a568a1602b522522a3a"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364728 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364736 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364742 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364763 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364769 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364774 4749 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364781 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364787 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364793 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.364798 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.367256 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/2.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.368009 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/1.log" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.368079 4749 generic.go:334] "Generic (PLEG): container finished" podID="454ec33e-9530-4cf0-ad08-9c3a21b0e56b" containerID="ddec8c7d8ebf8d0d087a1bdc6857aeb0504b9501b77508a43197f1f205864a99" exitCode=2 Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.368129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerDied","Data":"ddec8c7d8ebf8d0d087a1bdc6857aeb0504b9501b77508a43197f1f205864a99"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.368166 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5"} Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.368874 4749 scope.go:117] "RemoveContainer" containerID="ddec8c7d8ebf8d0d087a1bdc6857aeb0504b9501b77508a43197f1f205864a99" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.398911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.398999 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert\") pod 
\"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399115 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399320 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr2f7\" (UniqueName: \"kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399513 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399554 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns\") pod \"52d1a95a-c900-4842-82c4-5f4c37a16fee\" (UID: \"52d1a95a-c900-4842-82c4-5f4c37a16fee\") " Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-config\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399878 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-bin\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-netns\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-script-lib\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovn-node-metrics-cert\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-kubelet\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.399981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xc96\" (UniqueName: \"kubernetes.io/projected/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-kube-api-access-4xc96\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-var-lib-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-systemd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-systemd-units\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400100 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-log-socket\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-netd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400135 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-env-overrides\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-ovn\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-slash\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-etc-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-node-log\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.400404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402441 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log" (OuterVolumeSpecName: "node-log") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402495 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402878 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash" (OuterVolumeSpecName: "host-slash") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402910 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket" (OuterVolumeSpecName: "log-socket") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.402993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.403527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.404271 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.404746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.405168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.407597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.410622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7" (OuterVolumeSpecName: "kube-api-access-pr2f7") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "kube-api-access-pr2f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.423879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "52d1a95a-c900-4842-82c4-5f4c37a16fee" (UID: "52d1a95a-c900-4842-82c4-5f4c37a16fee"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.453929 4749 scope.go:117] "RemoveContainer" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.476735 4749 scope.go:117] "RemoveContainer" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.497469 4749 scope.go:117] "RemoveContainer" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-ovn\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-slash\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-ovn\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-etc-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-node-log\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-slash\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-config\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-bin\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-netns\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-script-lib\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovn-node-metrics-cert\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-kubelet\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xc96\" (UniqueName: \"kubernetes.io/projected/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-kube-api-access-4xc96\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-var-lib-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.501942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-ovn-kubernetes\") pod \"ovnkube-node-nh9m6\" (UID: 
\"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-systemd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-systemd-units\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-log-socket\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502217 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-netd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-env-overrides\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502334 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502346 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502372 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502386 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502396 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502407 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502418 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-node-log\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502428 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-var-lib-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-ovn-kubernetes\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-systemd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-systemd-units\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-config\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-log-socket\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502658 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-etc-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-netd\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-node-log\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-run-openvswitch\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-run-netns\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-cni-bin\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.502452 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-host-kubelet\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503324 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-env-overrides\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503383 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503429 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503452 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503471 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503494 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pr2f7\" (UniqueName: \"kubernetes.io/projected/52d1a95a-c900-4842-82c4-5f4c37a16fee-kube-api-access-pr2f7\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503516 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-log-socket\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503540 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-host-slash\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503569 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503586 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52d1a95a-c900-4842-82c4-5f4c37a16fee-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503614 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52d1a95a-c900-4842-82c4-5f4c37a16fee-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.503639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovnkube-script-lib\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.508252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-ovn-node-metrics-cert\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.533932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xc96\" (UniqueName: 
\"kubernetes.io/projected/8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996-kube-api-access-4xc96\") pod \"ovnkube-node-nh9m6\" (UID: \"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996\") " pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.534268 4749 scope.go:117] "RemoveContainer" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.552154 4749 scope.go:117] "RemoveContainer" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.570636 4749 scope.go:117] "RemoveContainer" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.593386 4749 scope.go:117] "RemoveContainer" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.610836 4749 scope.go:117] "RemoveContainer" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.620182 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.628417 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.629121 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.629172 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} err="failed to get container status \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.629229 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.629726 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": container with ID starting with dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8 not found: ID does not exist" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.629820 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} err="failed to get container status \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": rpc error: code = NotFound desc = 
could not find container \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": container with ID starting with dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.629890 4749 scope.go:117] "RemoveContainer" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.630487 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": container with ID starting with f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81 not found: ID does not exist" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.630522 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} err="failed to get container status \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": rpc error: code = NotFound desc = could not find container \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": container with ID starting with f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.630541 4749 scope.go:117] "RemoveContainer" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.631100 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": container with ID starting with 70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180 not found: ID does not exist" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.631162 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} err="failed to get container status \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": rpc error: code = NotFound desc = could not find container \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": container with ID starting with 70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.631228 4749 scope.go:117] "RemoveContainer" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.631836 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": container with ID starting with d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2 not found: ID does not exist" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.631927 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} err="failed to get container status \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": rpc error: code = NotFound desc = could not find container \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": container with ID starting with d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.631984 4749 scope.go:117] "RemoveContainer" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.633350 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": container with ID starting with 6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f not found: ID does not exist" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.633392 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} err="failed to get container status \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": rpc error: code = NotFound desc = could not find container \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": container with ID starting with 6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.633410 4749 scope.go:117] "RemoveContainer" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.634828 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": container with ID starting with f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e not found: ID does not exist" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.634866 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} err="failed to get container status \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": rpc error: code = NotFound desc = could not find container \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": container with ID starting with f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.634890 4749 scope.go:117] "RemoveContainer" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.635282 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": container with ID starting with d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0 not found: ID does not exist" 
containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.635307 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} err="failed to get container status \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": rpc error: code = NotFound desc = could not find container \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": container with ID starting with d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.635325 4749 scope.go:117] "RemoveContainer" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.635919 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": container with ID starting with 218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72 not found: ID does not exist" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.635949 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} err="failed to get container status \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": rpc error: code = NotFound desc = could not find container \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": container with ID starting with 218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.635966 4749 scope.go:117] "RemoveContainer" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" Nov 29 01:24:43 crc kubenswrapper[4749]: E1129 01:24:43.637239 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": container with ID starting with 21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4 not found: ID does not exist" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.637265 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} err="failed to get container status \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": rpc error: code = NotFound desc = could not find container \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": container with ID starting with 21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.637281 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.637563 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} err="failed to get container status \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.637584 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638056 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} err="failed to get container status \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": rpc error: code = NotFound desc = could not find container \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": container with ID starting with dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638078 4749 scope.go:117] "RemoveContainer" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638452 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} err="failed to get container status \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": rpc error: code = NotFound desc = could not find container \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": container with ID starting with f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638483 4749 scope.go:117] "RemoveContainer" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638916 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} err="failed to get container status \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": rpc error: code = NotFound desc = could not find container \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": container with ID starting with 70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.638936 4749 scope.go:117] "RemoveContainer" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.639309 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} err="failed to get container status \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": rpc error: code = NotFound desc = could not find container \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": container with ID starting with d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2 not found: ID does not exist" Nov 
29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.639331 4749 scope.go:117] "RemoveContainer" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.639695 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} err="failed to get container status \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": rpc error: code = NotFound desc = could not find container \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": container with ID starting with 6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.639726 4749 scope.go:117] "RemoveContainer" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640087 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} err="failed to get container status \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": rpc error: code = NotFound desc = could not find container \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": container with ID starting with f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640107 4749 scope.go:117] "RemoveContainer" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640439 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} err="failed to get container status \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": rpc error: code = NotFound desc = could not find container \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": container with ID starting with d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640460 4749 scope.go:117] "RemoveContainer" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640826 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} err="failed to get container status \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": rpc error: code = NotFound desc = could not find container \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": container with ID starting with 218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.640848 4749 scope.go:117] "RemoveContainer" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641170 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} err="failed to get container status 
\"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": rpc error: code = NotFound desc = could not find container \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": container with ID starting with 21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641191 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641535 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} err="failed to get container status \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641558 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641940 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} err="failed to get container status \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": rpc error: code = NotFound desc = could not find container \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": container with ID starting with dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.641960 4749 scope.go:117] "RemoveContainer" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.642307 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} err="failed to get container status \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": rpc error: code = NotFound desc = could not find container \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": container with ID starting with f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.642343 4749 scope.go:117] "RemoveContainer" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.642739 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} err="failed to get container status \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": rpc error: code = NotFound desc = could not find container \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": container with ID starting with 70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.642761 4749 scope.go:117] "RemoveContainer" 
containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.643115 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} err="failed to get container status \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": rpc error: code = NotFound desc = could not find container \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": container with ID starting with d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.643139 4749 scope.go:117] "RemoveContainer" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.644683 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} err="failed to get container status \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": rpc error: code = NotFound desc = could not find container \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": container with ID starting with 6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.644713 4749 scope.go:117] "RemoveContainer" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645131 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} err="failed to get container status \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": rpc error: code = NotFound desc = could not find container \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": container with ID starting with f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645151 4749 scope.go:117] "RemoveContainer" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645387 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} err="failed to get container status \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": rpc error: code = NotFound desc = could not find container \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": container with ID starting with d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645408 4749 scope.go:117] "RemoveContainer" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645612 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} err="failed to get container status \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": rpc error: code = NotFound desc = could not find 
container \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": container with ID starting with 218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645630 4749 scope.go:117] "RemoveContainer" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645840 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} err="failed to get container status \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": rpc error: code = NotFound desc = could not find container \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": container with ID starting with 21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.645867 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646104 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} err="failed to get container status \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646125 4749 scope.go:117] "RemoveContainer" containerID="dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646335 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8"} err="failed to get container status \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": rpc error: code = NotFound desc = could not find container \"dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8\": container with ID starting with dcf1dfd595a46c306057b8c4f4bc1e6457ebc57f1312288d138c3d54d9f7cff8 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646354 4749 scope.go:117] "RemoveContainer" containerID="f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646543 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81"} err="failed to get container status \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": rpc error: code = NotFound desc = could not find container \"f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81\": container with ID starting with f0ec3c364839d4ccf31db54ef559283af2c7647b0972315533bd2800bf965d81 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646562 4749 scope.go:117] "RemoveContainer" containerID="70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646779 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180"} err="failed to get container status \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": rpc error: code = NotFound desc = could not find container \"70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180\": container with ID starting with 70ccc2f3d9592b905bb8e9142dada2ac99f2303c8cba1435c176de5eb635d180 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.646797 4749 scope.go:117] "RemoveContainer" containerID="d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647008 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2"} err="failed to get container status \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": rpc error: code = NotFound desc = could not find container \"d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2\": container with ID starting with d104b3ac140b1f2accaac69d91d2d0150889c2fae7864e226f9e5759c2477ee2 not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647034 4749 scope.go:117] "RemoveContainer" containerID="6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647572 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f"} err="failed to get container status \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": rpc error: code = NotFound desc = could not find container \"6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f\": container with ID starting with 6b1019f84363f73c2880ac06c8195370bdd6d262b2a04efea176818a3cfd5a2f not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647604 4749 scope.go:117] "RemoveContainer" containerID="f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647802 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e"} err="failed to get container status \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": rpc error: code = NotFound desc = could not find container \"f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e\": container with ID starting with f2d4887911362cf4416ee4ccee39d2e55a016d3479bb15956e440becbeb2893e not found: ID does not exist" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.647822 4749 scope.go:117] "RemoveContainer" containerID="d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0" Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648012 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0"} err="failed to get container status \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": rpc error: code = NotFound desc = could not find container \"d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0\": container with ID starting with 
d88e40a29f21c41e8e585108499a19a576b6840c3d2a079f4f45c972a3d541a0 not found: ID does not exist"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648031 4749 scope.go:117] "RemoveContainer" containerID="218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648242 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72"} err="failed to get container status \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": rpc error: code = NotFound desc = could not find container \"218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72\": container with ID starting with 218d13a9b29e34398e31beef2ed1e702b98dcd69834eb566a34c286f50bb7a72 not found: ID does not exist"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648263 4749 scope.go:117] "RemoveContainer" containerID="21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648451 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4"} err="failed to get container status \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": rpc error: code = NotFound desc = could not find container \"21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4\": container with ID starting with 21eeffee03e9344cc21ff0ab145c82a10e59846617ffa41e31d2d237e704f4b4 not found: ID does not exist"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648470 4749 scope.go:117] "RemoveContainer" containerID="7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.648652 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc"} err="failed to get container status \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": rpc error: code = NotFound desc = could not find container \"7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc\": container with ID starting with 7ae63776e7cb5c0efef7fe771c8feaf6b6243a144db6adb4314ef9de3a8057dc not found: ID does not exist"
Nov 29 01:24:43 crc kubenswrapper[4749]: W1129 01:24:43.658040 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eecef6f_f9ee_4b09_bd8b_a4f84bd1c996.slice/crio-7ecb89d5d62bef2ed43a2689a1e32714c63c955c9ec7b04227b8194354d88270 WatchSource:0}: Error finding container 7ecb89d5d62bef2ed43a2689a1e32714c63c955c9ec7b04227b8194354d88270: Status 404 returned error can't find the container with id 7ecb89d5d62bef2ed43a2689a1e32714c63c955c9ec7b04227b8194354d88270
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.704089 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m7sg4"]
Nov 29 01:24:43 crc kubenswrapper[4749]: I1129 01:24:43.713570 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m7sg4"]
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.378695 4749 generic.go:334] "Generic (PLEG): container finished" podID="8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996" containerID="b2090fbcd1861554bbf892ff70f9befc6c322dda46753a9c59af64f2e049e709" exitCode=0
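The run of paired "RemoveContainer" / "DeleteContainer returned error" entries that ends above is the kubelet's cleanup path re-deleting containers whose records CRI-O has already dropped: every rpc error: code = NotFound here is benign, since a container that cannot be found needs no further removal, and the kubelet just logs the error and moves on. A minimal sketch of that idempotent-deletion pattern, assuming a gRPC-backed CRI client (the helper names are illustrative, not the kubelet's actual code):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer call; here it always
// reports NotFound, like the entries above.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

// cleanupContainer treats NotFound as success: a container that no longer
// exists is already in the desired state, so repeated deletions converge
// instead of accumulating real failures.
func cleanupContainer(id string) error {
	err := removeContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil
	}
	return fmt.Errorf("remove container %s: %w", id, err)
}

func main() {
	if err := cleanupContainer("d104b3ac140b"); err != nil {
		fmt.Println("cleanup failed:", err)
		return
	}
	fmt.Println("container already gone; treated as removed")
}

Treating NotFound as success is what lets the retries above converge silently instead of surfacing as pod errors.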
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.378808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerDied","Data":"b2090fbcd1861554bbf892ff70f9befc6c322dda46753a9c59af64f2e049e709"}
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.379340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"7ecb89d5d62bef2ed43a2689a1e32714c63c955c9ec7b04227b8194354d88270"}
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.392155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/2.log"
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.393475 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/1.log"
Nov 29 01:24:44 crc kubenswrapper[4749]: I1129 01:24:44.393706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7g" event={"ID":"454ec33e-9530-4cf0-ad08-9c3a21b0e56b","Type":"ContainerStarted","Data":"a460e8d595ab2d8b1e232f79bcf1e8f409bfea0ad407343e1a7e5958ca02df70"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.085449 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d1a95a-c900-4842-82c4-5f4c37a16fee" path="/var/lib/kubelet/pods/52d1a95a-c900-4842-82c4-5f4c37a16fee/volumes"
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"167916844dae94bd8457cb44b7d3beda287ee09b018c3d43ce17a652c715293a"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"c2e9378668a4f26450bee83688735481d987816f47919947b45cb94b0f554977"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"744abf2008fcac710e61701263b590f693e33311e7fe6dbe0e1e074102a2ce96"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"f360a03934e59bf1406bacabe86d0b19f2d6472c8188fe78bd521ec8b986e925"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"934fe1514900ba91034f096b3a03107508e058cb8d814ed39d60af36e8b17572"}
Nov 29 01:24:45 crc kubenswrapper[4749]: I1129 01:24:45.412316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"4c61d62347a4b817286cf4f62716b90bb75686bb295a6e0f3535492cab9a27d6"}
Nov 29 01:24:48 crc kubenswrapper[4749]: I1129 01:24:48.443511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"438d7964c56a6592dc8fbd3adecf0db5eb3bc98e0e2a30e993ba04533dbcfe65"}
Nov 29 01:24:51 crc kubenswrapper[4749]: I1129 01:24:51.480128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" event={"ID":"8eecef6f-f9ee-4b09-bd8b-a4f84bd1c996","Type":"ContainerStarted","Data":"7e0bae03463a5d995cd72bda9d62d81e283f0ed84a6f3133057528640b94dcc4"}
Nov 29 01:24:51 crc kubenswrapper[4749]: I1129 01:24:51.481370 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6"
Nov 29 01:24:51 crc kubenswrapper[4749]: I1129 01:24:51.527824 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" podStartSLOduration=8.527788391 podStartE2EDuration="8.527788391s" podCreationTimestamp="2025-11-29 01:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:24:51.521252943 +0000 UTC m=+834.693402860" watchObservedRunningTime="2025-11-29 01:24:51.527788391 +0000 UTC m=+834.699938288"
Nov 29 01:24:51 crc kubenswrapper[4749]: I1129 01:24:51.537547 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.488124 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.488703 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.527089 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.577256 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ddv87"]
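The "Observed pod startup duration" entry above is worth unpacking: podStartSLOduration is the time from podCreationTimestamp to the observed running time, minus any time spent pulling images; here the pulling timestamps are Go zero values (no pull was needed), so the SLO duration equals the end-to-end duration. The arithmetic can be checked directly from the logged timestamps (the pull-exclusion rule is my reading of the tracker, stated as an assumption):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-11-29 01:24:43 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-11-29 01:24:51.527788391 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// No image pulls were recorded, so nothing is subtracted from the
	// end-to-end duration.
	fmt.Println(running.Sub(created)) // 8.527788391s, matching podStartSLOduration
}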
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.578265 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.596075 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.596189 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.596443 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.596945 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cnnk7"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.599048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ddv87"]
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.781880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lqv\" (UniqueName: \"kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.782007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.782051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.883009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.883094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.883216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lqv\" (UniqueName: \"kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.883585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " 
pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.884234 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:52 crc kubenswrapper[4749]: I1129 01:24:52.922644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8lqv\" (UniqueName: \"kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv\") pod \"crc-storage-crc-ddv87\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") " pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:53 crc kubenswrapper[4749]: I1129 01:24:53.221374 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.269878 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(b369faad2789120e990c994ec423184a1e5c424049fc22829f51f30772578cdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.270433 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(b369faad2789120e990c994ec423184a1e5c424049fc22829f51f30772578cdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.270500 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(b369faad2789120e990c994ec423184a1e5c424049fc22829f51f30772578cdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.270592 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ddv87_crc-storage(fb8d4b7b-6cf6-48c5-89d9-595c9b835306)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ddv87_crc-storage(fb8d4b7b-6cf6-48c5-89d9-595c9b835306)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(b369faad2789120e990c994ec423184a1e5c424049fc22829f51f30772578cdb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ddv87" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" Nov 29 01:24:53 crc kubenswrapper[4749]: I1129 01:24:53.497436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:24:53 crc kubenswrapper[4749]: I1129 01:24:53.498554 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.530347 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(7bf1caffc44f9d62cb66ccc2ce5bed3cef480e2c9852c1b95eea5fbd49200853): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.530483 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(7bf1caffc44f9d62cb66ccc2ce5bed3cef480e2c9852c1b95eea5fbd49200853): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.530559 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(7bf1caffc44f9d62cb66ccc2ce5bed3cef480e2c9852c1b95eea5fbd49200853): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:24:53 crc kubenswrapper[4749]: E1129 01:24:53.530704 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-ddv87_crc-storage(fb8d4b7b-6cf6-48c5-89d9-595c9b835306)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-ddv87_crc-storage(fb8d4b7b-6cf6-48c5-89d9-595c9b835306)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-ddv87_crc-storage_fb8d4b7b-6cf6-48c5-89d9-595c9b835306_0(7bf1caffc44f9d62cb66ccc2ce5bed3cef480e2c9852c1b95eea5fbd49200853): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-ddv87" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306"
Nov 29 01:24:57 crc kubenswrapper[4749]: I1129 01:24:57.460799 4749 scope.go:117] "RemoveContainer" containerID="3dd8be97625a3fb9303d77d35776906e307b02a50c362c022a5e7386f51b54f5"
Nov 29 01:24:57 crc kubenswrapper[4749]: I1129 01:24:57.543999 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7g_454ec33e-9530-4cf0-ad08-9c3a21b0e56b/kube-multus/2.log"
Nov 29 01:25:08 crc kubenswrapper[4749]: I1129 01:25:08.075052 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87"
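The repeated sandbox failures above are an ordering problem, not a broken pod: the container runtime refuses RunPodSandbox until a network configuration exists in /etc/kubernetes/cni/net.d/, and on this node that configuration only reappears once the ovnkube-node pod restarted earlier is running again. The kubelet simply requeues the pod, which is why the same error repeats and then stops on its own; the sandbox 4103eacb... does start at 01:25:08 below. A rough sketch of the gating condition, assuming the runtime accepts any .conf, .conflist, or .json file in that directory (the exact matching rules belong to CRI-O and are not reproduced here):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether any CNI network definition is present in dir,
// approximating the check behind "no CNI configuration file in ...".
func hasCNIConfig(dir string) bool {
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err == nil && len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	if !hasCNIConfig("/etc/kubernetes/cni/net.d") {
		fmt.Println("no CNI configuration file yet; sandbox creation will fail")
		os.Exit(1)
	}
	fmt.Println("CNI configuration present; sandboxes can be created")
}

On a healthy node the directory contains the network provider's conflist and the check passes, at which point the kubelet's next retry succeeds.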
Nov 29 01:25:08 crc kubenswrapper[4749]: I1129 01:25:08.076576 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:25:08 crc kubenswrapper[4749]: I1129 01:25:08.327952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ddv87"]
Nov 29 01:25:08 crc kubenswrapper[4749]: I1129 01:25:08.632167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ddv87" event={"ID":"fb8d4b7b-6cf6-48c5-89d9-595c9b835306","Type":"ContainerStarted","Data":"4103eacb7ff5e677ed48b115e26289fd1c03d3f400414f0b4065c7c2a923d909"}
Nov 29 01:25:10 crc kubenswrapper[4749]: I1129 01:25:10.651571 4749 generic.go:334] "Generic (PLEG): container finished" podID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" containerID="fe6938cd9b4daee386b0b15740a94067610d45d768906bf777a57e67ae1e4756" exitCode=0
Nov 29 01:25:10 crc kubenswrapper[4749]: I1129 01:25:10.652656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ddv87" event={"ID":"fb8d4b7b-6cf6-48c5-89d9-595c9b835306","Type":"ContainerDied","Data":"fe6938cd9b4daee386b0b15740a94067610d45d768906bf777a57e67ae1e4756"}
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.019471 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87"
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.221536 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8lqv\" (UniqueName: \"kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv\") pod \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") "
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.221648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt\") pod \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") "
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.221910 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage\") pod \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\" (UID: \"fb8d4b7b-6cf6-48c5-89d9-595c9b835306\") "
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.222291 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fb8d4b7b-6cf6-48c5-89d9-595c9b835306" (UID: "fb8d4b7b-6cf6-48c5-89d9-595c9b835306"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.235557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv" (OuterVolumeSpecName: "kube-api-access-q8lqv") pod "fb8d4b7b-6cf6-48c5-89d9-595c9b835306" (UID: "fb8d4b7b-6cf6-48c5-89d9-595c9b835306"). InnerVolumeSpecName "kube-api-access-q8lqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.256145 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fb8d4b7b-6cf6-48c5-89d9-595c9b835306" (UID: "fb8d4b7b-6cf6-48c5-89d9-595c9b835306"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.324659 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.324719 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8lqv\" (UniqueName: \"kubernetes.io/projected/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-kube-api-access-q8lqv\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.324744 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fb8d4b7b-6cf6-48c5-89d9-595c9b835306-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.673108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ddv87" event={"ID":"fb8d4b7b-6cf6-48c5-89d9-595c9b835306","Type":"ContainerDied","Data":"4103eacb7ff5e677ed48b115e26289fd1c03d3f400414f0b4065c7c2a923d909"} Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.673177 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4103eacb7ff5e677ed48b115e26289fd1c03d3f400414f0b4065c7c2a923d909" Nov 29 01:25:12 crc kubenswrapper[4749]: I1129 01:25:12.673400 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ddv87" Nov 29 01:25:13 crc kubenswrapper[4749]: I1129 01:25:13.662102 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nh9m6" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.460026 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s"] Nov 29 01:25:20 crc kubenswrapper[4749]: E1129 01:25:20.461856 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" containerName="storage" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.461892 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" containerName="storage" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.462106 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" containerName="storage" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.463571 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.465862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.471679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s"] Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.561534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.561599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddb4g\" (UniqueName: \"kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.561626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.663312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.663493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.663556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddb4g\" (UniqueName: \"kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.664563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.664744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.693986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddb4g\" (UniqueName: \"kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:20 crc kubenswrapper[4749]: I1129 01:25:20.798349 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:21 crc kubenswrapper[4749]: I1129 01:25:21.095952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s"] Nov 29 01:25:21 crc kubenswrapper[4749]: I1129 01:25:21.801103 4749 generic.go:334] "Generic (PLEG): container finished" podID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerID="cd1619c66cd626d3adfe2a873e6c15f8086fccd5375e7920c5930a7e856e662b" exitCode=0 Nov 29 01:25:21 crc kubenswrapper[4749]: I1129 01:25:21.801739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" event={"ID":"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b","Type":"ContainerDied","Data":"cd1619c66cd626d3adfe2a873e6c15f8086fccd5375e7920c5930a7e856e662b"} Nov 29 01:25:21 crc kubenswrapper[4749]: I1129 01:25:21.801801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" event={"ID":"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b","Type":"ContainerStarted","Data":"ebad66b1a416ca43f4275dc2ba008a30794ef076b859cb1125503be2d2cca566"} Nov 29 01:25:23 crc kubenswrapper[4749]: I1129 01:25:23.821427 4749 generic.go:334] "Generic (PLEG): container finished" podID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerID="aaf03c78a169b969268dc695332bd21c34594cce05530ffe7f01530fbfc22a8e" exitCode=0 Nov 29 01:25:23 crc kubenswrapper[4749]: I1129 01:25:23.821504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" event={"ID":"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b","Type":"ContainerDied","Data":"aaf03c78a169b969268dc695332bd21c34594cce05530ffe7f01530fbfc22a8e"} Nov 29 01:25:24 crc kubenswrapper[4749]: I1129 01:25:24.833200 4749 generic.go:334] "Generic (PLEG): container finished" podID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerID="2efdceece0755a09d8534fe736e6d2eac75b195bf948f989d49e8b99c77e06da" exitCode=0 Nov 29 01:25:24 crc kubenswrapper[4749]: I1129 
01:25:24.833346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" event={"ID":"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b","Type":"ContainerDied","Data":"2efdceece0755a09d8534fe736e6d2eac75b195bf948f989d49e8b99c77e06da"} Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.149806 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.264259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddb4g\" (UniqueName: \"kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g\") pod \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.264316 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle\") pod \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.264471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util\") pod \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\" (UID: \"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b\") " Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.266188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle" (OuterVolumeSpecName: "bundle") pod "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" (UID: "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.280262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util" (OuterVolumeSpecName: "util") pod "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" (UID: "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.286932 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g" (OuterVolumeSpecName: "kube-api-access-ddb4g") pod "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" (UID: "994ad8e8-6c08-4674-a8dd-715d8c8f1e5b"). InnerVolumeSpecName "kube-api-access-ddb4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.365751 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddb4g\" (UniqueName: \"kubernetes.io/projected/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-kube-api-access-ddb4g\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.366170 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.366267 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/994ad8e8-6c08-4674-a8dd-715d8c8f1e5b-util\") on node \"crc\" DevicePath \"\"" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.850594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" event={"ID":"994ad8e8-6c08-4674-a8dd-715d8c8f1e5b","Type":"ContainerDied","Data":"ebad66b1a416ca43f4275dc2ba008a30794ef076b859cb1125503be2d2cca566"} Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.851381 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebad66b1a416ca43f4275dc2ba008a30794ef076b859cb1125503be2d2cca566" Nov 29 01:25:26 crc kubenswrapper[4749]: I1129 01:25:26.850958 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.041648 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2"] Nov 29 01:25:32 crc kubenswrapper[4749]: E1129 01:25:32.042553 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="util" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.042576 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="util" Nov 29 01:25:32 crc kubenswrapper[4749]: E1129 01:25:32.042597 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="extract" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.042606 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="extract" Nov 29 01:25:32 crc kubenswrapper[4749]: E1129 01:25:32.042622 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="pull" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.042632 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="pull" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.042764 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="994ad8e8-6c08-4674-a8dd-715d8c8f1e5b" containerName="extract" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.043408 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.046153 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.046558 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.046869 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bjtps" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.057483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2"] Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.175953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwbj2\" (UniqueName: \"kubernetes.io/projected/21426103-df8a-47f8-ac5d-60c843e56c3d-kube-api-access-gwbj2\") pod \"nmstate-operator-5b5b58f5c8-dskn2\" (UID: \"21426103-df8a-47f8-ac5d-60c843e56c3d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.277682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwbj2\" (UniqueName: \"kubernetes.io/projected/21426103-df8a-47f8-ac5d-60c843e56c3d-kube-api-access-gwbj2\") pod \"nmstate-operator-5b5b58f5c8-dskn2\" (UID: \"21426103-df8a-47f8-ac5d-60c843e56c3d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.320854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwbj2\" (UniqueName: \"kubernetes.io/projected/21426103-df8a-47f8-ac5d-60c843e56c3d-kube-api-access-gwbj2\") pod \"nmstate-operator-5b5b58f5c8-dskn2\" (UID: \"21426103-df8a-47f8-ac5d-60c843e56c3d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.371371 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.774397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2"] Nov 29 01:25:32 crc kubenswrapper[4749]: I1129 01:25:32.897071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" event={"ID":"21426103-df8a-47f8-ac5d-60c843e56c3d","Type":"ContainerStarted","Data":"cc72f3d0e48f2d96f534e07ecad4e16a5eaf3c0bf0e3801dac329050e5ce9686"} Nov 29 01:25:35 crc kubenswrapper[4749]: I1129 01:25:35.953493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" event={"ID":"21426103-df8a-47f8-ac5d-60c843e56c3d","Type":"ContainerStarted","Data":"5faacae4da3dca947e3c1cde7580c77644dbd01c439dddc388b1188ead60edf7"} Nov 29 01:25:35 crc kubenswrapper[4749]: I1129 01:25:35.988158 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-dskn2" podStartSLOduration=1.573223627 podStartE2EDuration="3.988126569s" podCreationTimestamp="2025-11-29 01:25:32 +0000 UTC" firstStartedPulling="2025-11-29 01:25:32.788519381 +0000 UTC m=+875.960669238" lastFinishedPulling="2025-11-29 01:25:35.203422323 +0000 UTC m=+878.375572180" observedRunningTime="2025-11-29 01:25:35.980347579 +0000 UTC m=+879.152497466" watchObservedRunningTime="2025-11-29 01:25:35.988126569 +0000 UTC m=+879.160276436" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.196246 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.199354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.203493 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-tvtgc" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.212818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.229247 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.231050 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.232930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.237463 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-h75fj"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.238578 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.265393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.322973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7fsm\" (UniqueName: \"kubernetes.io/projected/e66f7428-ba60-42fc-92b6-b45c1d974b8b-kube-api-access-l7fsm\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-dbus-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-ovs-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8t9b\" (UniqueName: \"kubernetes.io/projected/0b80b7b3-a0dc-488b-9431-2016284ab8af-kube-api-access-h8t9b\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323155 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj4b4\" (UniqueName: \"kubernetes.io/projected/50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe-kube-api-access-xj4b4\") pod \"nmstate-metrics-7f946cbc9-qsmwj\" (UID: \"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.323176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-nmstate-lock\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.341593 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.342796 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.345804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.345978 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-p4gdw"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.346113 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.364105 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"]
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424237 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02baa9c1-affa-4d66-afac-5c8bd20bf097-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-dbus-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6b44\" (UniqueName: \"kubernetes.io/projected/02baa9c1-affa-4d66-afac-5c8bd20bf097-kube-api-access-p6b44\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-ovs-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: E1129 01:25:41.424494 4749 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
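The secret.go error above and the nestedpendingoperations entry that follows show a routine startup race: nmstate-webhook-5f6d4c5ccb-7jn4q was scheduled before its openshift-nmstate-webhook TLS secret existed, so SetUp of tls-key-pair fails, and instead of failing the pod the kubelet requeues the mount with a delay (durationBeforeRetry 500ms). Failed volume operations back off exponentially on repeated failure; a sketch of that policy using the 500ms initial delay from this log (the doubling factor and the cap are assumptions for illustration):

package main

import (
	"fmt"
	"time"
)

// durationBeforeRetry doubles the delay after each consecutive failure and
// stops growing at a cap, so a missing secret is polled quickly at first and
// gently afterwards.
func durationBeforeRetry(failures int) time.Duration {
	const initial = 500 * time.Millisecond // matches the log entry below
	const maxDelay = 2 * time.Minute       // assumed cap for illustration
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d -> retry in %v\n", n, durationBeforeRetry(n))
	}
}

Once the operator creates the secret, the next scheduled retry mounts it and the webhook pod proceeds; nothing in the log suggests manual intervention was needed.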
Nov 29 01:25:41 crc kubenswrapper[4749]: E1129 01:25:41.424715 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair podName:e66f7428-ba60-42fc-92b6-b45c1d974b8b nodeName:}" failed. No retries permitted until 2025-11-29 01:25:41.924681234 +0000 UTC m=+885.096831091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-7jn4q" (UID: "e66f7428-ba60-42fc-92b6-b45c1d974b8b") : secret "openshift-nmstate-webhook" not found
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-ovs-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8t9b\" (UniqueName: \"kubernetes.io/projected/0b80b7b3-a0dc-488b-9431-2016284ab8af-kube-api-access-h8t9b\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424883 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-dbus-socket\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj4b4\" (UniqueName: \"kubernetes.io/projected/50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe-kube-api-access-xj4b4\") pod \"nmstate-metrics-7f946cbc9-qsmwj\" (UID: \"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.424977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02baa9c1-affa-4d66-afac-5c8bd20bf097-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.425054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-nmstate-lock\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.425185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7fsm\" (UniqueName: \"kubernetes.io/projected/e66f7428-ba60-42fc-92b6-b45c1d974b8b-kube-api-access-l7fsm\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.425218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0b80b7b3-a0dc-488b-9431-2016284ab8af-nmstate-lock\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.455791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"kube-api-access-l7fsm\" (UniqueName: \"kubernetes.io/projected/e66f7428-ba60-42fc-92b6-b45c1d974b8b-kube-api-access-l7fsm\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.466904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj4b4\" (UniqueName: \"kubernetes.io/projected/50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe-kube-api-access-xj4b4\") pod \"nmstate-metrics-7f946cbc9-qsmwj\" (UID: \"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.467691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8t9b\" (UniqueName: \"kubernetes.io/projected/0b80b7b3-a0dc-488b-9431-2016284ab8af-kube-api-access-h8t9b\") pod \"nmstate-handler-h75fj\" (UID: \"0b80b7b3-a0dc-488b-9431-2016284ab8af\") " pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.526852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6b44\" (UniqueName: \"kubernetes.io/projected/02baa9c1-affa-4d66-afac-5c8bd20bf097-kube-api-access-p6b44\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.526966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02baa9c1-affa-4d66-afac-5c8bd20bf097-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.527043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02baa9c1-affa-4d66-afac-5c8bd20bf097-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.527428 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.528448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/02baa9c1-affa-4d66-afac-5c8bd20bf097-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.532746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/02baa9c1-affa-4d66-afac-5c8bd20bf097-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.556600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6b44\" (UniqueName: \"kubernetes.io/projected/02baa9c1-affa-4d66-afac-5c8bd20bf097-kube-api-access-p6b44\") pod \"nmstate-console-plugin-7fbb5f6569-4vlld\" (UID: \"02baa9c1-affa-4d66-afac-5c8bd20bf097\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.558168 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55ff78d854-zrmvk"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.559450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.574702 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.580365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ff78d854-zrmvk"] Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-service-ca\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629140 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-oauth-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-trusted-ca-bundle\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbrf4\" (UniqueName: \"kubernetes.io/projected/fc933ed4-df0a-428d-8c9b-97adf75c2d08-kube-api-access-nbrf4\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-oauth-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.629563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.666047 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-oauth-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-trusted-ca-bundle\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbrf4\" (UniqueName: \"kubernetes.io/projected/fc933ed4-df0a-428d-8c9b-97adf75c2d08-kube-api-access-nbrf4\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-oauth-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.731920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-service-ca\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.734232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.734484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-oauth-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: 
\"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.736743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-trusted-ca-bundle\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.737466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc933ed4-df0a-428d-8c9b-97adf75c2d08-service-ca\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.742149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-oauth-config\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.747797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc933ed4-df0a-428d-8c9b-97adf75c2d08-console-serving-cert\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.751117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbrf4\" (UniqueName: \"kubernetes.io/projected/fc933ed4-df0a-428d-8c9b-97adf75c2d08-kube-api-access-nbrf4\") pod \"console-55ff78d854-zrmvk\" (UID: \"fc933ed4-df0a-428d-8c9b-97adf75c2d08\") " pod="openshift-console/console-55ff78d854-zrmvk" Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.815212 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj"] Nov 29 01:25:41 crc kubenswrapper[4749]: W1129 01:25:41.825536 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50eb0349_3cd4_44cf_93ad_76f0ed8ef9fe.slice/crio-8455b82e456a3d581af2e3f9601ebc002e6b9757391d09452aac120719dbe4ce WatchSource:0}: Error finding container 8455b82e456a3d581af2e3f9601ebc002e6b9757391d09452aac120719dbe4ce: Status 404 returned error can't find the container with id 8455b82e456a3d581af2e3f9601ebc002e6b9757391d09452aac120719dbe4ce Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.907965 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld"] Nov 29 01:25:41 crc kubenswrapper[4749]: W1129 01:25:41.912171 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02baa9c1_affa_4d66_afac_5c8bd20bf097.slice/crio-1f11b5cdaef81490ee8f2d6a635373aebf4e7de5d37ae97411909764a1883abf WatchSource:0}: Error finding container 1f11b5cdaef81490ee8f2d6a635373aebf4e7de5d37ae97411909764a1883abf: Status 404 returned error can't find the container with id 1f11b5cdaef81490ee8f2d6a635373aebf4e7de5d37ae97411909764a1883abf Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.914623 4749 util.go:30] "No 
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.914623 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55ff78d854-zrmvk"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.936128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"
Nov 29 01:25:41 crc kubenswrapper[4749]: I1129 01:25:41.942940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e66f7428-ba60-42fc-92b6-b45c1d974b8b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-7jn4q\" (UID: \"e66f7428-ba60-42fc-92b6-b45c1d974b8b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"
Nov 29 01:25:42 crc kubenswrapper[4749]: I1129 01:25:42.024538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" event={"ID":"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe","Type":"ContainerStarted","Data":"8455b82e456a3d581af2e3f9601ebc002e6b9757391d09452aac120719dbe4ce"}
Nov 29 01:25:42 crc kubenswrapper[4749]: I1129 01:25:42.026840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" event={"ID":"02baa9c1-affa-4d66-afac-5c8bd20bf097","Type":"ContainerStarted","Data":"1f11b5cdaef81490ee8f2d6a635373aebf4e7de5d37ae97411909764a1883abf"}
Nov 29 01:25:42 crc kubenswrapper[4749]: I1129 01:25:42.028015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h75fj" event={"ID":"0b80b7b3-a0dc-488b-9431-2016284ab8af","Type":"ContainerStarted","Data":"9ab016e0630670218a0f9a2e267f3d48c45cbb53a7427051a6a30484e33364e8"}
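The `tls-key-pair` mount that failed at the top of this excerpt (secret "openshift-nmstate-webhook" not found) succeeds here at 01:25:41.942940: the kubelet's volume manager retries MountVolume.SetUp on every sync with backoff until the referenced Secret exists, so a pod created before its serving cert is minted recovers on its own. A minimal sketch of the same wait-for-secret pattern from the API side, using client-go (the poll interval and timeout are illustrative assumptions, not values from this cluster):

```go
// Sketch: block until a Secret exists, mirroring the kubelet's retry loop
// for secret-backed volumes. Assumes a reachable kubeconfig at the default path.
package main

import (
	"context"
	"fmt"
	"time"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Poll until the secret shows up, like the volume manager's per-sync retry.
	err = wait.PollUntilContextTimeout(context.Background(), 2*time.Second, 2*time.Minute, true,
		func(ctx context.Context) (bool, error) {
			_, err := client.CoreV1().Secrets("openshift-nmstate").
				Get(ctx, "openshift-nmstate-webhook", metav1.GetOptions{})
			if apierrors.IsNotFound(err) {
				return false, nil // not created yet; keep retrying
			}
			return err == nil, err
		})
	if err != nil {
		panic(err)
	}
	fmt.Println("secret present; the volume mount can now succeed")
}
```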
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:42 crc kubenswrapper[4749]: I1129 01:25:42.164529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55ff78d854-zrmvk"] Nov 29 01:25:42 crc kubenswrapper[4749]: I1129 01:25:42.358808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"] Nov 29 01:25:42 crc kubenswrapper[4749]: W1129 01:25:42.364603 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode66f7428_ba60_42fc_92b6_b45c1d974b8b.slice/crio-275cedcfd9fa5eda6f03dc53505d878dadef59a8c08518a27f9364792ae26029 WatchSource:0}: Error finding container 275cedcfd9fa5eda6f03dc53505d878dadef59a8c08518a27f9364792ae26029: Status 404 returned error can't find the container with id 275cedcfd9fa5eda6f03dc53505d878dadef59a8c08518a27f9364792ae26029 Nov 29 01:25:43 crc kubenswrapper[4749]: I1129 01:25:43.047445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" event={"ID":"e66f7428-ba60-42fc-92b6-b45c1d974b8b","Type":"ContainerStarted","Data":"275cedcfd9fa5eda6f03dc53505d878dadef59a8c08518a27f9364792ae26029"} Nov 29 01:25:43 crc kubenswrapper[4749]: I1129 01:25:43.067631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-zrmvk" event={"ID":"fc933ed4-df0a-428d-8c9b-97adf75c2d08","Type":"ContainerStarted","Data":"b823d46d373d49b4186e2f5cbe3ca63f0168ce88887bd0d084274eb545683372"} Nov 29 01:25:43 crc kubenswrapper[4749]: I1129 01:25:43.067722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55ff78d854-zrmvk" event={"ID":"fc933ed4-df0a-428d-8c9b-97adf75c2d08","Type":"ContainerStarted","Data":"fae1b9ce14e4446723d53d2db0136bb22ecf8d993a57b32a2ad8b84298b11a8c"} Nov 29 01:25:43 crc kubenswrapper[4749]: I1129 01:25:43.093655 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55ff78d854-zrmvk" podStartSLOduration=2.093622457 podStartE2EDuration="2.093622457s" podCreationTimestamp="2025-11-29 01:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:25:43.090118662 +0000 UTC m=+886.262268539" watchObservedRunningTime="2025-11-29 01:25:43.093622457 +0000 UTC m=+886.265772314" Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.094138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-h75fj" event={"ID":"0b80b7b3-a0dc-488b-9431-2016284ab8af","Type":"ContainerStarted","Data":"57cf260ca3d173844967ba07d8cb3cdc2b8da1d333aaa6031be50dfb8331d66c"} Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.095394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-h75fj" Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.097417 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" event={"ID":"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe","Type":"ContainerStarted","Data":"ca7a68ed3e389c188e8cc2283f27b5fbf03ac3bc7d50eec55aaf4eae1c81f34a"} Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.099875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" 
event={"ID":"e66f7428-ba60-42fc-92b6-b45c1d974b8b","Type":"ContainerStarted","Data":"89380e00dbd9a4d5bbaca25061a9475efe5e55293ac7c5481d043f4a7a76356a"} Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.100007 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.102966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" event={"ID":"02baa9c1-affa-4d66-afac-5c8bd20bf097","Type":"ContainerStarted","Data":"1dce79b10bbbc121ac8ccbd1f22be636ddbd052724b1d4297380f9790f2b1468"} Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.126167 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-h75fj" podStartSLOduration=1.70970574 podStartE2EDuration="5.126130595s" podCreationTimestamp="2025-11-29 01:25:41 +0000 UTC" firstStartedPulling="2025-11-29 01:25:41.633107064 +0000 UTC m=+884.805256921" lastFinishedPulling="2025-11-29 01:25:45.049531879 +0000 UTC m=+888.221681776" observedRunningTime="2025-11-29 01:25:46.117039183 +0000 UTC m=+889.289189120" watchObservedRunningTime="2025-11-29 01:25:46.126130595 +0000 UTC m=+889.298280482" Nov 29 01:25:46 crc kubenswrapper[4749]: I1129 01:25:46.150907 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q" podStartSLOduration=2.4652763650000002 podStartE2EDuration="5.150866659s" podCreationTimestamp="2025-11-29 01:25:41 +0000 UTC" firstStartedPulling="2025-11-29 01:25:42.366891587 +0000 UTC m=+885.539041454" lastFinishedPulling="2025-11-29 01:25:45.052481851 +0000 UTC m=+888.224631748" observedRunningTime="2025-11-29 01:25:46.146296367 +0000 UTC m=+889.318446244" watchObservedRunningTime="2025-11-29 01:25:46.150866659 +0000 UTC m=+889.323016556" Nov 29 01:25:47 crc kubenswrapper[4749]: I1129 01:25:47.099463 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4vlld" podStartSLOduration=2.974286587 podStartE2EDuration="6.099433377s" podCreationTimestamp="2025-11-29 01:25:41 +0000 UTC" firstStartedPulling="2025-11-29 01:25:41.914672302 +0000 UTC m=+885.086822159" lastFinishedPulling="2025-11-29 01:25:45.039819042 +0000 UTC m=+888.211968949" observedRunningTime="2025-11-29 01:25:46.172841985 +0000 UTC m=+889.344991852" watchObservedRunningTime="2025-11-29 01:25:47.099433377 +0000 UTC m=+890.271583234" Nov 29 01:25:48 crc kubenswrapper[4749]: I1129 01:25:48.125365 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" event={"ID":"50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe","Type":"ContainerStarted","Data":"56f6126134e6f1476404e01e46cf2018937e320aa1a2bea0c6c9d0f5f038cbf6"} Nov 29 01:25:48 crc kubenswrapper[4749]: I1129 01:25:48.152080 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qsmwj" podStartSLOduration=1.315763397 podStartE2EDuration="7.152063326s" podCreationTimestamp="2025-11-29 01:25:41 +0000 UTC" firstStartedPulling="2025-11-29 01:25:41.828492767 +0000 UTC m=+885.000642624" lastFinishedPulling="2025-11-29 01:25:47.664792696 +0000 UTC m=+890.836942553" observedRunningTime="2025-11-29 01:25:48.151455332 +0000 UTC m=+891.323605199" watchObservedRunningTime="2025-11-29 01:25:48.152063326 +0000 UTC m=+891.324213183" Nov 29 01:25:51 
Nov 29 01:25:51 crc kubenswrapper[4749]: I1129 01:25:51.620299 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-h75fj"
Nov 29 01:25:51 crc kubenswrapper[4749]: I1129 01:25:51.915616 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55ff78d854-zrmvk"
Nov 29 01:25:51 crc kubenswrapper[4749]: I1129 01:25:51.915707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55ff78d854-zrmvk"
Nov 29 01:25:51 crc kubenswrapper[4749]: I1129 01:25:51.924158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55ff78d854-zrmvk"
Nov 29 01:25:52 crc kubenswrapper[4749]: I1129 01:25:52.167495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55ff78d854-zrmvk"
Nov 29 01:25:52 crc kubenswrapper[4749]: I1129 01:25:52.262443 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-klx5g"]
Nov 29 01:25:55 crc kubenswrapper[4749]: I1129 01:25:55.375124 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 01:25:55 crc kubenswrapper[4749]: I1129 01:25:55.375309 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 01:26:02 crc kubenswrapper[4749]: I1129 01:26:02.170633 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-7jn4q"
Nov 29 01:26:17 crc kubenswrapper[4749]: I1129 01:26:17.331262 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-klx5g" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerName="console" containerID="cri-o://9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f" gracePeriod=15
Nov 29 01:26:17 crc kubenswrapper[4749]: I1129 01:26:17.836444 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-klx5g_efe19f2e-05c9-4ebb-a96a-9bdb02cebba8/console/0.log"
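The console pod's probe sequence above is the normal ordering when a startup probe is defined: readiness reports status="" (unknown) while the startup probe is still "unhealthy", because liveness and readiness results are not acted on until the startup probe passes; once startup flips to "started", readiness goes "ready" within a second. The machine-config-daemon failure is a plain HTTP liveness probe being refused on 127.0.0.1:8798. A sketch of probes with that shape using the Go API types (paths, periods, and thresholds here are illustrative assumptions, not read from these pods' actual specs):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Startup probe: the kubelet suppresses liveness/readiness until this succeeds.
	startup := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/health", Port: intstr.FromInt32(8443)},
		},
		PeriodSeconds:    3,
		FailureThreshold: 30, // generous window for slow-starting servers
	}
	// Liveness probe of the machine-config-daemon shape: an HTTP GET on a
	// localhost port; "connection refused" surfaces as the failure in the log.
	liveness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{Path: "/health", Port: intstr.FromInt32(8798)},
		},
		PeriodSeconds:    10,
		FailureThreshold: 3,
	}
	fmt.Println(startup.HTTPGet.Path, liveness.HTTPGet.Port.IntValue())
}
```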
Need to start a new one" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdd6q\" (UniqueName: \"kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.002942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert\") pod \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\" (UID: \"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8\") " Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.003648 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.003744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.003868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca" (OuterVolumeSpecName: "service-ca") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.004254 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config" (OuterVolumeSpecName: "console-config") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.011857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.014321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.017412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q" (OuterVolumeSpecName: "kube-api-access-zdd6q") pod "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" (UID: "efe19f2e-05c9-4ebb-a96a-9bdb02cebba8"). InnerVolumeSpecName "kube-api-access-zdd6q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104570 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104620 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104633 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104642 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-console-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104654 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104663 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-service-ca\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.104672 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdd6q\" (UniqueName: \"kubernetes.io/projected/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8-kube-api-access-zdd6q\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381634 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-klx5g_efe19f2e-05c9-4ebb-a96a-9bdb02cebba8/console/0.log" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381701 4749 generic.go:334] "Generic (PLEG): container finished" podID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerID="9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f" exitCode=2 Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-klx5g" event={"ID":"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8","Type":"ContainerDied","Data":"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f"} Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-klx5g" event={"ID":"efe19f2e-05c9-4ebb-a96a-9bdb02cebba8","Type":"ContainerDied","Data":"f5b06bd016653980d82de75e75738fbac6a9b4c9be63792339ad8bcbb98cd339"} Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381811 4749 scope.go:117] "RemoveContainer" containerID="9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.381876 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-klx5g" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.410849 4749 scope.go:117] "RemoveContainer" containerID="9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f" Nov 29 01:26:18 crc kubenswrapper[4749]: E1129 01:26:18.411772 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f\": container with ID starting with 9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f not found: ID does not exist" containerID="9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.411839 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f"} err="failed to get container status \"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f\": rpc error: code = NotFound desc = could not find container \"9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f\": container with ID starting with 9935a0a2067c7d310b418e4545c5c30ab63b918ddd3837303e92045368ab3a9f not found: ID does not exist" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.424797 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-klx5g"] Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.440020 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-klx5g"] Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.878649 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c"] Nov 29 01:26:18 crc kubenswrapper[4749]: E1129 01:26:18.879106 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerName="console" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.879140 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerName="console" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.879641 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" containerName="console" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.881441 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.885820 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.888999 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c"] Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.930645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.930739 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnc69\" (UniqueName: \"kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:18 crc kubenswrapper[4749]: I1129 01:26:18.930827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.031906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.031993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnc69\" (UniqueName: \"kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.032104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.032679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.034079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.055077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnc69\" (UniqueName: \"kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.084302 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efe19f2e-05c9-4ebb-a96a-9bdb02cebba8" path="/var/lib/kubelet/pods/efe19f2e-05c9-4ebb-a96a-9bdb02cebba8/volumes" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.254809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:19 crc kubenswrapper[4749]: I1129 01:26:19.562024 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c"] Nov 29 01:26:20 crc kubenswrapper[4749]: I1129 01:26:20.404347 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerID="fbaf1540c82a1cbf724148cb013748a7ccd9f91fd6d95757ed57b0d53aa70621" exitCode=0 Nov 29 01:26:20 crc kubenswrapper[4749]: I1129 01:26:20.404408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" event={"ID":"d8ea565f-9a8a-48f1-aba9-d603fcf591c4","Type":"ContainerDied","Data":"fbaf1540c82a1cbf724148cb013748a7ccd9f91fd6d95757ed57b0d53aa70621"} Nov 29 01:26:20 crc kubenswrapper[4749]: I1129 01:26:20.404445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" event={"ID":"d8ea565f-9a8a-48f1-aba9-d603fcf591c4","Type":"ContainerStarted","Data":"d174ef3bf7daec655fa8ed125c1f4d99823e2a53ec0503cd55010a9b9a4bbf18"} Nov 29 01:26:22 crc kubenswrapper[4749]: I1129 01:26:22.425347 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerID="d255ba4209605e94123f6542e8deeb18edd53a9a09f4c2877cbce1ff6d8d8a1f" exitCode=0 Nov 29 01:26:22 crc kubenswrapper[4749]: I1129 01:26:22.425476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" event={"ID":"d8ea565f-9a8a-48f1-aba9-d603fcf591c4","Type":"ContainerDied","Data":"d255ba4209605e94123f6542e8deeb18edd53a9a09f4c2877cbce1ff6d8d8a1f"} Nov 29 01:26:23 crc kubenswrapper[4749]: I1129 
01:26:23.437939 4749 generic.go:334] "Generic (PLEG): container finished" podID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerID="205ab83763c81f460763cd0f04142be693b3521c2564bfc7c99fa35e4b82fccf" exitCode=0 Nov 29 01:26:23 crc kubenswrapper[4749]: I1129 01:26:23.438060 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" event={"ID":"d8ea565f-9a8a-48f1-aba9-d603fcf591c4","Type":"ContainerDied","Data":"205ab83763c81f460763cd0f04142be693b3521c2564bfc7c99fa35e4b82fccf"} Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.713351 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.827102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util\") pod \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.827188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnc69\" (UniqueName: \"kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69\") pod \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.827238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle\") pod \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\" (UID: \"d8ea565f-9a8a-48f1-aba9-d603fcf591c4\") " Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.829424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle" (OuterVolumeSpecName: "bundle") pod "d8ea565f-9a8a-48f1-aba9-d603fcf591c4" (UID: "d8ea565f-9a8a-48f1-aba9-d603fcf591c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.834384 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69" (OuterVolumeSpecName: "kube-api-access-tnc69") pod "d8ea565f-9a8a-48f1-aba9-d603fcf591c4" (UID: "d8ea565f-9a8a-48f1-aba9-d603fcf591c4"). InnerVolumeSpecName "kube-api-access-tnc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.840892 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util" (OuterVolumeSpecName: "util") pod "d8ea565f-9a8a-48f1-aba9-d603fcf591c4" (UID: "d8ea565f-9a8a-48f1-aba9-d603fcf591c4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.928876 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-util\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.928930 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnc69\" (UniqueName: \"kubernetes.io/projected/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-kube-api-access-tnc69\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:24 crc kubenswrapper[4749]: I1129 01:26:24.928949 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d8ea565f-9a8a-48f1-aba9-d603fcf591c4-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:26:25 crc kubenswrapper[4749]: I1129 01:26:25.374017 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:26:25 crc kubenswrapper[4749]: I1129 01:26:25.374101 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:26:25 crc kubenswrapper[4749]: I1129 01:26:25.457435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" event={"ID":"d8ea565f-9a8a-48f1-aba9-d603fcf591c4","Type":"ContainerDied","Data":"d174ef3bf7daec655fa8ed125c1f4d99823e2a53ec0503cd55010a9b9a4bbf18"} Nov 29 01:26:25 crc kubenswrapper[4749]: I1129 01:26:25.457522 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c" Nov 29 01:26:25 crc kubenswrapper[4749]: I1129 01:26:25.457521 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d174ef3bf7daec655fa8ed125c1f4d99823e2a53ec0503cd55010a9b9a4bbf18" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.328084 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b479d995c-252wh"] Nov 29 01:26:34 crc kubenswrapper[4749]: E1129 01:26:34.329252 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="pull" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.329270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="pull" Nov 29 01:26:34 crc kubenswrapper[4749]: E1129 01:26:34.329282 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="util" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.329289 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="util" Nov 29 01:26:34 crc kubenswrapper[4749]: E1129 01:26:34.329299 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="extract" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.329306 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="extract" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.329423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ea565f-9a8a-48f1-aba9-d603fcf591c4" containerName="extract" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.329958 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.332827 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.333608 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.334100 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.334493 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xnrz8" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.335013 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.348835 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b479d995c-252wh"] Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.403756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-apiservice-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.403833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgvp5\" (UniqueName: \"kubernetes.io/projected/ef286966-9492-4971-a5a1-072fd0de42e6-kube-api-access-rgvp5\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.403874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-webhook-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.504334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgvp5\" (UniqueName: \"kubernetes.io/projected/ef286966-9492-4971-a5a1-072fd0de42e6-kube-api-access-rgvp5\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.504415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-webhook-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.504461 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-apiservice-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.512663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-webhook-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.527302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgvp5\" (UniqueName: \"kubernetes.io/projected/ef286966-9492-4971-a5a1-072fd0de42e6-kube-api-access-rgvp5\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.529026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ef286966-9492-4971-a5a1-072fd0de42e6-apiservice-cert\") pod \"metallb-operator-controller-manager-5b479d995c-252wh\" (UID: \"ef286966-9492-4971-a5a1-072fd0de42e6\") " pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.650233 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.709162 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"] Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.710093 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.710093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.714936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-p4bzr"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.716565 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.716580 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.736253 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"]
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.807909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5sw\" (UniqueName: \"kubernetes.io/projected/7240a7b6-7fc6-4960-8db0-55e75820bd36-kube-api-access-ht5sw\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.807997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-webhook-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.808062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-apiservice-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.909363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-apiservice-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.909453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5sw\" (UniqueName: \"kubernetes.io/projected/7240a7b6-7fc6-4960-8db0-55e75820bd36-kube-api-access-ht5sw\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.909494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-webhook-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.923642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-apiservice-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.926451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7240a7b6-7fc6-4960-8db0-55e75820bd36-webhook-cert\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:34 crc kubenswrapper[4749]: I1129 01:26:34.954225 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5sw\" (UniqueName: \"kubernetes.io/projected/7240a7b6-7fc6-4960-8db0-55e75820bd36-kube-api-access-ht5sw\") pod \"metallb-operator-webhook-server-68569fcffb-jj942\" (UID: \"7240a7b6-7fc6-4960-8db0-55e75820bd36\") " pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:35 crc kubenswrapper[4749]: I1129 01:26:35.024912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"
Nov 29 01:26:35 crc kubenswrapper[4749]: I1129 01:26:35.086544 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b479d995c-252wh"]
Nov 29 01:26:35 crc kubenswrapper[4749]: I1129 01:26:35.279119 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68569fcffb-jj942"]
Nov 29 01:26:35 crc kubenswrapper[4749]: W1129 01:26:35.299099 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7240a7b6_7fc6_4960_8db0_55e75820bd36.slice/crio-ba1261dc18f0e6e46531f3bdeed9e398d5290bb06e3ef3dc08a33d06c805b604 WatchSource:0}: Error finding container ba1261dc18f0e6e46531f3bdeed9e398d5290bb06e3ef3dc08a33d06c805b604: Status 404 returned error can't find the container with id ba1261dc18f0e6e46531f3bdeed9e398d5290bb06e3ef3dc08a33d06c805b604
Nov 29 01:26:35 crc kubenswrapper[4749]: I1129 01:26:35.538618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" event={"ID":"ef286966-9492-4971-a5a1-072fd0de42e6","Type":"ContainerStarted","Data":"70c927890a681252c98a1425bc8b2fd4422a8d9a3f8129bfa327f0c4a9181a3c"}
Nov 29 01:26:35 crc kubenswrapper[4749]: I1129 01:26:35.540809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942" event={"ID":"7240a7b6-7fc6-4960-8db0-55e75820bd36","Type":"ContainerStarted","Data":"ba1261dc18f0e6e46531f3bdeed9e398d5290bb06e3ef3dc08a33d06c805b604"}
Nov 29 01:26:40 crc kubenswrapper[4749]: I1129 01:26:40.599953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" event={"ID":"ef286966-9492-4971-a5a1-072fd0de42e6","Type":"ContainerStarted","Data":"72d692c4922695ec6d073bff159e145108a443b9d3d021420cbec18fb27d6a53"}
pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:26:40 crc kubenswrapper[4749]: I1129 01:26:40.602514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942" event={"ID":"7240a7b6-7fc6-4960-8db0-55e75820bd36","Type":"ContainerStarted","Data":"0b2ceee38960c7eafc01c366f84376c00fdedd600e566cac03ae7263bf25ad28"} Nov 29 01:26:40 crc kubenswrapper[4749]: I1129 01:26:40.602795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942" Nov 29 01:26:40 crc kubenswrapper[4749]: I1129 01:26:40.624428 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" podStartSLOduration=1.7341099679999998 podStartE2EDuration="6.624381588s" podCreationTimestamp="2025-11-29 01:26:34 +0000 UTC" firstStartedPulling="2025-11-29 01:26:35.106910496 +0000 UTC m=+938.279060353" lastFinishedPulling="2025-11-29 01:26:39.997182116 +0000 UTC m=+943.169331973" observedRunningTime="2025-11-29 01:26:40.620342777 +0000 UTC m=+943.792492644" watchObservedRunningTime="2025-11-29 01:26:40.624381588 +0000 UTC m=+943.796531495" Nov 29 01:26:40 crc kubenswrapper[4749]: I1129 01:26:40.662421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942" podStartSLOduration=1.944963078 podStartE2EDuration="6.662388528s" podCreationTimestamp="2025-11-29 01:26:34 +0000 UTC" firstStartedPulling="2025-11-29 01:26:35.303311494 +0000 UTC m=+938.475461351" lastFinishedPulling="2025-11-29 01:26:40.020736944 +0000 UTC m=+943.192886801" observedRunningTime="2025-11-29 01:26:40.661500776 +0000 UTC m=+943.833650663" watchObservedRunningTime="2025-11-29 01:26:40.662388528 +0000 UTC m=+943.834538395" Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.032497 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-68569fcffb-jj942" Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.374908 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.375024 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.375112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.376455 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:26:55 crc 
kubenswrapper[4749]: I1129 01:26:55.376609 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2" gracePeriod=600 Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.716025 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2" exitCode=0 Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.716093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2"} Nov 29 01:26:55 crc kubenswrapper[4749]: I1129 01:26:55.716605 4749 scope.go:117] "RemoveContainer" containerID="62034ef1cdc0575fc9032dd2650147700f908c8a4b290d41887f55a1ffc76581" Nov 29 01:26:57 crc kubenswrapper[4749]: I1129 01:26:57.738645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8"} Nov 29 01:27:14 crc kubenswrapper[4749]: I1129 01:27:14.654613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b479d995c-252wh" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.555363 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rb7rz"] Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.558423 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.565033 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.565173 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-r7xxd" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.567355 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.587687 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"] Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.588704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.590856 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.602360 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"] Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.691859 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-tdt6d"] Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.693225 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.693225 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.695513 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.695733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.696064 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-47dc8"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.696632 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-startup\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-conf\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics-certs\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-sockets\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704370 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgpkb\" (UniqueName: \"kubernetes.io/projected/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-kube-api-access-tgpkb\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlq2\" (UniqueName: \"kubernetes.io/projected/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-kube-api-access-tvlq2\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.704415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-reloader\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.736275 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-5z7ln"]
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.737640 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-5z7ln"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.739756 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.755312 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5z7ln"]
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.806505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psg98\" (UniqueName: \"kubernetes.io/projected/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-kube-api-access-psg98\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.806575 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgpkb\" (UniqueName: \"kubernetes.io/projected/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-kube-api-access-tgpkb\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.806609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlq2\" (UniqueName: \"kubernetes.io/projected/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-kube-api-access-tvlq2\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.806756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metallb-excludel2\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.806846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-reloader\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz"
\"kube-api-access-pqslg\" (UniqueName: \"kubernetes.io/projected/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-kube-api-access-pqslg\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-startup\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metrics-certs\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-conf\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics-certs\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-metrics-certs\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-cert\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-sockets\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " 
pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-reloader\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-conf\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807852 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.807916 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-sockets\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.808367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-frr-startup\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.814705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.815607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-metrics-certs\") pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.822891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvlq2\" (UniqueName: \"kubernetes.io/projected/cf49ecc1-5b2d-4b0b-a06e-0193e60947cd-kube-api-access-tvlq2\") pod \"frr-k8s-webhook-server-7fcb986d4-kkp2m\" (UID: \"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.835889 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgpkb\" (UniqueName: \"kubernetes.io/projected/6937d960-a5b2-45b6-99cb-1f6ed6e0563a-kube-api-access-tgpkb\") 
pod \"frr-k8s-rb7rz\" (UID: \"6937d960-a5b2-45b6-99cb-1f6ed6e0563a\") " pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.880295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.905245 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqslg\" (UniqueName: \"kubernetes.io/projected/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-kube-api-access-pqslg\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metrics-certs\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-metrics-certs\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-cert\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909373 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psg98\" (UniqueName: \"kubernetes.io/projected/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-kube-api-access-psg98\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.909404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metallb-excludel2\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: E1129 01:27:15.910054 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.910282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metallb-excludel2\") pod \"speaker-tdt6d\" (UID: 
\"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: E1129 01:27:15.910315 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist podName:1bd292ff-8eb0-4ed0-95cd-6ba367873d7a nodeName:}" failed. No retries permitted until 2025-11-29 01:27:16.410155259 +0000 UTC m=+979.582305116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist") pod "speaker-tdt6d" (UID: "1bd292ff-8eb0-4ed0-95cd-6ba367873d7a") : secret "metallb-memberlist" not found Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.912434 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.914484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-metrics-certs\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.914554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-metrics-certs\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.929605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqslg\" (UniqueName: \"kubernetes.io/projected/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-kube-api-access-pqslg\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.931671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-cert\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:15 crc kubenswrapper[4749]: I1129 01:27:15.932938 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psg98\" (UniqueName: \"kubernetes.io/projected/ba1f212b-3cc2-4f6e-9b71-443f17d0e113-kube-api-access-psg98\") pod \"controller-f8648f98b-5z7ln\" (UID: \"ba1f212b-3cc2-4f6e-9b71-443f17d0e113\") " pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.053863 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.053863 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-5z7ln"
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.267444 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5z7ln"]
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.413439 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"]
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.424126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:16 crc kubenswrapper[4749]: E1129 01:27:16.424396 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Nov 29 01:27:16 crc kubenswrapper[4749]: E1129 01:27:16.424465 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist podName:1bd292ff-8eb0-4ed0-95cd-6ba367873d7a nodeName:}" failed. No retries permitted until 2025-11-29 01:27:17.424441667 +0000 UTC m=+980.596591524 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist") pod "speaker-tdt6d" (UID: "1bd292ff-8eb0-4ed0-95cd-6ba367873d7a") : secret "metallb-memberlist" not found
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.916206 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5z7ln" event={"ID":"ba1f212b-3cc2-4f6e-9b71-443f17d0e113","Type":"ContainerStarted","Data":"3adb0dcfbb8196be86887c0bd2692584e80f1d44c24765fdab6f7721dcb13f46"}
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.916271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5z7ln" event={"ID":"ba1f212b-3cc2-4f6e-9b71-443f17d0e113","Type":"ContainerStarted","Data":"74a429816023362ff942ec31a63a7b0e55bacc7fd18791989554c9f4d562c051"}
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.916287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5z7ln" event={"ID":"ba1f212b-3cc2-4f6e-9b71-443f17d0e113","Type":"ContainerStarted","Data":"c64699fb2e72c919cd9ab543955801d600bdfbb6b8268521213daf745145155a"}
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.916386 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-5z7ln"
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.917578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" event={"ID":"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd","Type":"ContainerStarted","Data":"0e7db4cfb8a7b4d7fd336af5bc87bc13d58bc48d061e3e5bad85217948362b4a"}
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.918504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"90ddd7cec523729e4cb41d53a83aa9956a1e97494a9d8e201f7c07e5a1d75f28"}
Nov 29 01:27:16 crc kubenswrapper[4749]: I1129 01:27:16.943896 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-5z7ln" podStartSLOduration=1.9438700230000001 podStartE2EDuration="1.943870023s" podCreationTimestamp="2025-11-29 01:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:27:16.938800246 +0000 UTC m=+980.110950103" watchObservedRunningTime="2025-11-29 01:27:16.943870023 +0000 UTC m=+980.116019880"
Nov 29 01:27:17 crc kubenswrapper[4749]: I1129 01:27:17.441837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:17 crc kubenswrapper[4749]: I1129 01:27:17.452960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/1bd292ff-8eb0-4ed0-95cd-6ba367873d7a-memberlist\") pod \"speaker-tdt6d\" (UID: \"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a\") " pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:17 crc kubenswrapper[4749]: I1129 01:27:17.513255 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:17 crc kubenswrapper[4749]: I1129 01:27:17.934882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdt6d" event={"ID":"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a","Type":"ContainerStarted","Data":"075e1f3fca08a8bc70587ca3b5157a77e1ce053d3b64ea82977a846f2288c888"}
Nov 29 01:27:18 crc kubenswrapper[4749]: I1129 01:27:18.951772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdt6d" event={"ID":"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a","Type":"ContainerStarted","Data":"8b1dbea57e44e1d0ccac66d4bccead5f9863378a5add7641a61dacb093789e30"}
Nov 29 01:27:18 crc kubenswrapper[4749]: I1129 01:27:18.952209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-tdt6d" event={"ID":"1bd292ff-8eb0-4ed0-95cd-6ba367873d7a","Type":"ContainerStarted","Data":"baca5e14f2721ef16fb2c1bad739bb65e3171887d36b5c28dadb94f5799a29d3"}
Nov 29 01:27:18 crc kubenswrapper[4749]: I1129 01:27:18.952258 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-tdt6d"
Nov 29 01:27:18 crc kubenswrapper[4749]: I1129 01:27:18.985698 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-tdt6d" podStartSLOduration=3.98566898 podStartE2EDuration="3.98566898s" podCreationTimestamp="2025-11-29 01:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:27:18.980733597 +0000 UTC m=+982.152883454" watchObservedRunningTime="2025-11-29 01:27:18.98566898 +0000 UTC m=+982.157818837"
Nov 29 01:27:24 crc kubenswrapper[4749]: I1129 01:27:24.026062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" event={"ID":"cf49ecc1-5b2d-4b0b-a06e-0193e60947cd","Type":"ContainerStarted","Data":"fd9c25ce19b1b871d0c5cd3d6850747baa47170704e52f36206b60b07541f239"}
Nov 29 01:27:24 crc kubenswrapper[4749]: I1129 01:27:24.027027 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m"
podID="6937d960-a5b2-45b6-99cb-1f6ed6e0563a" containerID="47245410d101ee09e7066ab552399ce97c2b1ed322d007cd112c602f5e56c03a" exitCode=0 Nov 29 01:27:24 crc kubenswrapper[4749]: I1129 01:27:24.032087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerDied","Data":"47245410d101ee09e7066ab552399ce97c2b1ed322d007cd112c602f5e56c03a"} Nov 29 01:27:24 crc kubenswrapper[4749]: I1129 01:27:24.053656 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" podStartSLOduration=1.961597255 podStartE2EDuration="9.053617097s" podCreationTimestamp="2025-11-29 01:27:15 +0000 UTC" firstStartedPulling="2025-11-29 01:27:16.433634496 +0000 UTC m=+979.605784393" lastFinishedPulling="2025-11-29 01:27:23.525654378 +0000 UTC m=+986.697804235" observedRunningTime="2025-11-29 01:27:24.047993497 +0000 UTC m=+987.220143404" watchObservedRunningTime="2025-11-29 01:27:24.053617097 +0000 UTC m=+987.225767014" Nov 29 01:27:25 crc kubenswrapper[4749]: I1129 01:27:25.044101 4749 generic.go:334] "Generic (PLEG): container finished" podID="6937d960-a5b2-45b6-99cb-1f6ed6e0563a" containerID="24405d69f56f367ee449641b21afd829ca35b95a7bb7bf18ff85ccb3c1ca359f" exitCode=0 Nov 29 01:27:25 crc kubenswrapper[4749]: I1129 01:27:25.044189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerDied","Data":"24405d69f56f367ee449641b21afd829ca35b95a7bb7bf18ff85ccb3c1ca359f"} Nov 29 01:27:26 crc kubenswrapper[4749]: I1129 01:27:26.056753 4749 generic.go:334] "Generic (PLEG): container finished" podID="6937d960-a5b2-45b6-99cb-1f6ed6e0563a" containerID="704c1739b1ec6fb541bc3ad55fee3fdf7416d3545c5931a15712369c9822a324" exitCode=0 Nov 29 01:27:26 crc kubenswrapper[4749]: I1129 01:27:26.056830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerDied","Data":"704c1739b1ec6fb541bc3ad55fee3fdf7416d3545c5931a15712369c9822a324"} Nov 29 01:27:26 crc kubenswrapper[4749]: I1129 01:27:26.067102 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-5z7ln" Nov 29 01:27:27 crc kubenswrapper[4749]: I1129 01:27:27.072234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"649c4354b66bbe81daca493c3fedd39ca3730e44327bfb160574fa655a04a074"} Nov 29 01:27:27 crc kubenswrapper[4749]: I1129 01:27:27.072928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"4420134cb9d3e7128184b96f14d61ffc086e95fbd0bf2f10ca7e8925ca8bcd98"} Nov 29 01:27:27 crc kubenswrapper[4749]: I1129 01:27:27.072965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"141591cd4499a1515eca914e263bfb8cb2cd93d38c351ce6aa748f511fd5a77c"} Nov 29 01:27:27 crc kubenswrapper[4749]: I1129 01:27:27.072991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" 
event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"9ad01557c3439ae721af877214ace52d63a52dbb486965a2bc410c1bf3a078cf"} Nov 29 01:27:27 crc kubenswrapper[4749]: I1129 01:27:27.520873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-tdt6d" Nov 29 01:27:28 crc kubenswrapper[4749]: I1129 01:27:28.090956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"18ae95ec3e97fffcc530972658a6fe541de279466a7fe8756672641c99d2adf8"} Nov 29 01:27:28 crc kubenswrapper[4749]: I1129 01:27:28.091041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rb7rz" event={"ID":"6937d960-a5b2-45b6-99cb-1f6ed6e0563a","Type":"ContainerStarted","Data":"c80f8191cd08bf4ffbf1c10565aecf9581eba108e42d7ed4e13117ad6eaec92d"} Nov 29 01:27:28 crc kubenswrapper[4749]: I1129 01:27:28.091265 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:28 crc kubenswrapper[4749]: I1129 01:27:28.131363 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rb7rz" podStartSLOduration=5.73171624 podStartE2EDuration="13.131333905s" podCreationTimestamp="2025-11-29 01:27:15 +0000 UTC" firstStartedPulling="2025-11-29 01:27:16.129006028 +0000 UTC m=+979.301155885" lastFinishedPulling="2025-11-29 01:27:23.528623683 +0000 UTC m=+986.700773550" observedRunningTime="2025-11-29 01:27:28.125875549 +0000 UTC m=+991.298025486" watchObservedRunningTime="2025-11-29 01:27:28.131333905 +0000 UTC m=+991.303483772" Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.183758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"] Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.186430 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.186430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.188733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.203386 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"]
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.280409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.280508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxhks\" (UniqueName: \"kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.280568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.381841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxhks\" (UniqueName: \"kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.381969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.382094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.383059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.383088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.413647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxhks\" (UniqueName: \"kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.519162 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"
Nov 29 01:27:29 crc kubenswrapper[4749]: I1129 01:27:29.812327 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7"]
Nov 29 01:27:29 crc kubenswrapper[4749]: W1129 01:27:29.813550 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54610f30_edc2_4aed_b76f_ba1ce5451386.slice/crio-1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c WatchSource:0}: Error finding container 1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c: Status 404 returned error can't find the container with id 1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c
Nov 29 01:27:30 crc kubenswrapper[4749]: I1129 01:27:30.116429 4749 generic.go:334] "Generic (PLEG): container finished" podID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerID="ec9dca82d9504a6c3a7267517089621ba23442abcee1f46dc92115867d0ee520" exitCode=0
Nov 29 01:27:30 crc kubenswrapper[4749]: I1129 01:27:30.116508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" event={"ID":"54610f30-edc2-4aed-b76f-ba1ce5451386","Type":"ContainerDied","Data":"ec9dca82d9504a6c3a7267517089621ba23442abcee1f46dc92115867d0ee520"}
Nov 29 01:27:30 crc kubenswrapper[4749]: I1129 01:27:30.116561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" event={"ID":"54610f30-edc2-4aed-b76f-ba1ce5451386","Type":"ContainerStarted","Data":"1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c"}
Nov 29 01:27:30 crc kubenswrapper[4749]: I1129 01:27:30.881515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rb7rz"
Nov 29 01:27:30 crc kubenswrapper[4749]: I1129 01:27:30.936873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rb7rz"
I1129 01:27:34.178570 4749 generic.go:334] "Generic (PLEG): container finished" podID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerID="8c81c24ce588b039d2c00bb4ab26d14dbcea62b38933604d0392dcd8a55b73d3" exitCode=0 Nov 29 01:27:34 crc kubenswrapper[4749]: I1129 01:27:34.179340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" event={"ID":"54610f30-edc2-4aed-b76f-ba1ce5451386","Type":"ContainerDied","Data":"8c81c24ce588b039d2c00bb4ab26d14dbcea62b38933604d0392dcd8a55b73d3"} Nov 29 01:27:35 crc kubenswrapper[4749]: I1129 01:27:35.191989 4749 generic.go:334] "Generic (PLEG): container finished" podID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerID="5068e3e6c32cd3a67850b7db5f162fa44745a2f367a45d1f39e22d36903d1b38" exitCode=0 Nov 29 01:27:35 crc kubenswrapper[4749]: I1129 01:27:35.192086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" event={"ID":"54610f30-edc2-4aed-b76f-ba1ce5451386","Type":"ContainerDied","Data":"5068e3e6c32cd3a67850b7db5f162fa44745a2f367a45d1f39e22d36903d1b38"} Nov 29 01:27:35 crc kubenswrapper[4749]: I1129 01:27:35.916488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kkp2m" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.527650 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.617417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util\") pod \"54610f30-edc2-4aed-b76f-ba1ce5451386\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.617920 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxhks\" (UniqueName: \"kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks\") pod \"54610f30-edc2-4aed-b76f-ba1ce5451386\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.618487 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle\") pod \"54610f30-edc2-4aed-b76f-ba1ce5451386\" (UID: \"54610f30-edc2-4aed-b76f-ba1ce5451386\") " Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.622906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle" (OuterVolumeSpecName: "bundle") pod "54610f30-edc2-4aed-b76f-ba1ce5451386" (UID: "54610f30-edc2-4aed-b76f-ba1ce5451386"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.637968 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks" (OuterVolumeSpecName: "kube-api-access-hxhks") pod "54610f30-edc2-4aed-b76f-ba1ce5451386" (UID: "54610f30-edc2-4aed-b76f-ba1ce5451386"). InnerVolumeSpecName "kube-api-access-hxhks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.645379 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util" (OuterVolumeSpecName: "util") pod "54610f30-edc2-4aed-b76f-ba1ce5451386" (UID: "54610f30-edc2-4aed-b76f-ba1ce5451386"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.720135 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.720188 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54610f30-edc2-4aed-b76f-ba1ce5451386-util\") on node \"crc\" DevicePath \"\"" Nov 29 01:27:36 crc kubenswrapper[4749]: I1129 01:27:36.720219 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxhks\" (UniqueName: \"kubernetes.io/projected/54610f30-edc2-4aed-b76f-ba1ce5451386-kube-api-access-hxhks\") on node \"crc\" DevicePath \"\"" Nov 29 01:27:37 crc kubenswrapper[4749]: I1129 01:27:37.218964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" event={"ID":"54610f30-edc2-4aed-b76f-ba1ce5451386","Type":"ContainerDied","Data":"1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c"} Nov 29 01:27:37 crc kubenswrapper[4749]: I1129 01:27:37.219522 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a6d2510afcd6ecd7ce7020fa37ed5292db1e7d2779d627ca8e363302575cf8c" Nov 29 01:27:37 crc kubenswrapper[4749]: I1129 01:27:37.219037 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.309127 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b"] Nov 29 01:27:42 crc kubenswrapper[4749]: E1129 01:27:42.309807 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="extract" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.309842 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="extract" Nov 29 01:27:42 crc kubenswrapper[4749]: E1129 01:27:42.309860 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="util" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.309867 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="util" Nov 29 01:27:42 crc kubenswrapper[4749]: E1129 01:27:42.309882 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="pull" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.309888 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="pull" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.310004 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="54610f30-edc2-4aed-b76f-ba1ce5451386" containerName="extract" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.310438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.312664 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.316142 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-j786j" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.316231 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.328906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b"] Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.409291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8078dba-95c9-4666-b447-3b5b47bdbb3e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.409377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shgg5\" (UniqueName: \"kubernetes.io/projected/c8078dba-95c9-4666-b447-3b5b47bdbb3e-kube-api-access-shgg5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.511089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8078dba-95c9-4666-b447-3b5b47bdbb3e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.511161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shgg5\" (UniqueName: \"kubernetes.io/projected/c8078dba-95c9-4666-b447-3b5b47bdbb3e-kube-api-access-shgg5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.511917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8078dba-95c9-4666-b447-3b5b47bdbb3e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.535082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shgg5\" (UniqueName: \"kubernetes.io/projected/c8078dba-95c9-4666-b447-3b5b47bdbb3e-kube-api-access-shgg5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-d745b\" (UID: \"c8078dba-95c9-4666-b447-3b5b47bdbb3e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.627638 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" Nov 29 01:27:42 crc kubenswrapper[4749]: I1129 01:27:42.949514 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b"] Nov 29 01:27:42 crc kubenswrapper[4749]: W1129 01:27:42.956205 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8078dba_95c9_4666_b447_3b5b47bdbb3e.slice/crio-3d218a4a0076333ab0372f463936d57e675fb67dde6091add2fc0023ae5021a9 WatchSource:0}: Error finding container 3d218a4a0076333ab0372f463936d57e675fb67dde6091add2fc0023ae5021a9: Status 404 returned error can't find the container with id 3d218a4a0076333ab0372f463936d57e675fb67dde6091add2fc0023ae5021a9 Nov 29 01:27:43 crc kubenswrapper[4749]: I1129 01:27:43.260732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" event={"ID":"c8078dba-95c9-4666-b447-3b5b47bdbb3e","Type":"ContainerStarted","Data":"3d218a4a0076333ab0372f463936d57e675fb67dde6091add2fc0023ae5021a9"} Nov 29 01:27:45 crc kubenswrapper[4749]: I1129 01:27:45.890441 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rb7rz" Nov 29 01:27:51 crc kubenswrapper[4749]: I1129 01:27:51.336561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" event={"ID":"c8078dba-95c9-4666-b447-3b5b47bdbb3e","Type":"ContainerStarted","Data":"4654e557018f50bbdc31f562ecb0698cd3c4f0912fabb552737cde091f6453d8"} Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.167113 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-d745b" podStartSLOduration=4.986691203 podStartE2EDuration="12.167084218s" podCreationTimestamp="2025-11-29 01:27:42 +0000 UTC" firstStartedPulling="2025-11-29 01:27:42.959839983 +0000 UTC m=+1006.131989840" lastFinishedPulling="2025-11-29 01:27:50.140232988 +0000 UTC m=+1013.312382855" observedRunningTime="2025-11-29 01:27:51.377051198 +0000 UTC m=+1014.549201065" watchObservedRunningTime="2025-11-29 01:27:54.167084218 +0000 UTC m=+1017.339234085" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.171895 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-h9fjh"] Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.172823 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.174821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.175033 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.176135 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wlcvw" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.182038 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-h9fjh"] Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.238238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.238390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxl5\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-kube-api-access-fvxl5\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.339952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.340103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxl5\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-kube-api-access-fvxl5\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.368114 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.368397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxl5\" (UniqueName: \"kubernetes.io/projected/04dcd0b8-9e22-4ffc-b291-d89de26d9afe-kube-api-access-fvxl5\") pod \"cert-manager-webhook-f4fb5df64-h9fjh\" (UID: \"04dcd0b8-9e22-4ffc-b291-d89de26d9afe\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.488822 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" Nov 29 01:27:54 crc kubenswrapper[4749]: I1129 01:27:54.836411 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-h9fjh"] Nov 29 01:27:55 crc kubenswrapper[4749]: I1129 01:27:55.365739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" event={"ID":"04dcd0b8-9e22-4ffc-b291-d89de26d9afe","Type":"ContainerStarted","Data":"a306c36aee5fcac97c5e3b67269288edbfb8df577e7a507040a1b9ba667f64c0"} Nov 29 01:27:56 crc kubenswrapper[4749]: I1129 01:27:56.993517 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7"] Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.036389 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7"] Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.036588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.042010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j2lsv" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.080533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d746906-3777-4543-a8c1-8bb4ff61fa60-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-7ztw7\" (UID: \"2d746906-3777-4543-a8c1-8bb4ff61fa60\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.080626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wznpt\" (UniqueName: \"kubernetes.io/projected/2d746906-3777-4543-a8c1-8bb4ff61fa60-kube-api-access-wznpt\") pod \"cert-manager-cainjector-855d9ccff4-7ztw7\" (UID: \"2d746906-3777-4543-a8c1-8bb4ff61fa60\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.181452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wznpt\" (UniqueName: \"kubernetes.io/projected/2d746906-3777-4543-a8c1-8bb4ff61fa60-kube-api-access-wznpt\") pod \"cert-manager-cainjector-855d9ccff4-7ztw7\" (UID: \"2d746906-3777-4543-a8c1-8bb4ff61fa60\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.181576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d746906-3777-4543-a8c1-8bb4ff61fa60-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-7ztw7\" (UID: \"2d746906-3777-4543-a8c1-8bb4ff61fa60\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.204109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wznpt\" (UniqueName: \"kubernetes.io/projected/2d746906-3777-4543-a8c1-8bb4ff61fa60-kube-api-access-wznpt\") pod \"cert-manager-cainjector-855d9ccff4-7ztw7\" (UID: \"2d746906-3777-4543-a8c1-8bb4ff61fa60\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.222327 4749 
Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.374573 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j2lsv"
Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.383023 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7"
Nov 29 01:27:57 crc kubenswrapper[4749]: I1129 01:27:57.598166 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7"]
Nov 29 01:27:57 crc kubenswrapper[4749]: W1129 01:27:57.603896 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d746906_3777_4543_a8c1_8bb4ff61fa60.slice/crio-5cb85a008a2d57fea490ac1762f6899dde9bff5bf0b05a1ba8411ae9fce7d322 WatchSource:0}: Error finding container 5cb85a008a2d57fea490ac1762f6899dde9bff5bf0b05a1ba8411ae9fce7d322: Status 404 returned error can't find the container with id 5cb85a008a2d57fea490ac1762f6899dde9bff5bf0b05a1ba8411ae9fce7d322
Nov 29 01:27:58 crc kubenswrapper[4749]: I1129 01:27:58.389488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" event={"ID":"2d746906-3777-4543-a8c1-8bb4ff61fa60","Type":"ContainerStarted","Data":"5cb85a008a2d57fea490ac1762f6899dde9bff5bf0b05a1ba8411ae9fce7d322"}
Nov 29 01:28:04 crc kubenswrapper[4749]: I1129 01:28:04.442373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" event={"ID":"04dcd0b8-9e22-4ffc-b291-d89de26d9afe","Type":"ContainerStarted","Data":"bfc8193e9f35f7b9fd40b7cda0f8656b228f583f6de17627492b3a43a6453ce7"}
Nov 29 01:28:04 crc kubenswrapper[4749]: I1129 01:28:04.443031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh"
Nov 29 01:28:04 crc kubenswrapper[4749]: I1129 01:28:04.445671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" event={"ID":"2d746906-3777-4543-a8c1-8bb4ff61fa60","Type":"ContainerStarted","Data":"028578921bed725b880cf04dd0ade270071edc8a3a203be08353f03f23def176"}
Nov 29 01:28:04 crc kubenswrapper[4749]: I1129 01:28:04.470425 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh" podStartSLOduration=1.6726665299999999 podStartE2EDuration="10.470393858s" podCreationTimestamp="2025-11-29 01:27:54 +0000 UTC" firstStartedPulling="2025-11-29 01:27:54.842124585 +0000 UTC m=+1018.014274442" lastFinishedPulling="2025-11-29 01:28:03.639851913 +0000 UTC m=+1026.812001770" observedRunningTime="2025-11-29 01:28:04.465327652 +0000 UTC m=+1027.637477519" watchObservedRunningTime="2025-11-29 01:28:04.470393858 +0000 UTC m=+1027.642543725"
Nov 29 01:28:04 crc kubenswrapper[4749]: I1129 01:28:04.496582 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-7ztw7" podStartSLOduration=2.48352799 podStartE2EDuration="8.496546201s" podCreationTimestamp="2025-11-29 01:27:56 +0000 UTC" firstStartedPulling="2025-11-29 01:27:57.608871764 +0000 UTC m=+1020.781021621" lastFinishedPulling="2025-11-29 01:28:03.621889975 +0000 UTC m=+1026.794039832" observedRunningTime="2025-11-29 01:28:04.488808998 +0000 UTC m=+1027.660958855" watchObservedRunningTime="2025-11-29 01:28:04.496546201 +0000 UTC m=+1027.668696088"
Nov 29 01:28:09 crc kubenswrapper[4749]: I1129 01:28:09.492135 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-h9fjh"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.335148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4fxz8"]
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.338089 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.341422 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6k25p"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.357838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4fxz8"]
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.373277 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-bound-sa-token\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.373358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlm78\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-kube-api-access-hlm78\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.474643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlm78\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-kube-api-access-hlm78\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.474759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-bound-sa-token\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.501428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-bound-sa-token\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.505186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlm78\" (UniqueName: \"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-kube-api-access-hlm78\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8"
\"kubernetes.io/projected/516a7074-ce39-4245-848d-8fc40b801000-kube-api-access-hlm78\") pod \"cert-manager-86cb77c54b-4fxz8\" (UID: \"516a7074-ce39-4245-848d-8fc40b801000\") " pod="cert-manager/cert-manager-86cb77c54b-4fxz8" Nov 29 01:28:13 crc kubenswrapper[4749]: I1129 01:28:13.677252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-4fxz8" Nov 29 01:28:14 crc kubenswrapper[4749]: I1129 01:28:14.162827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-4fxz8"] Nov 29 01:28:14 crc kubenswrapper[4749]: I1129 01:28:14.532563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-4fxz8" event={"ID":"516a7074-ce39-4245-848d-8fc40b801000","Type":"ContainerStarted","Data":"a7b16ecfc1467b496c2b3968fed3333e828ddd2a1215c88f4c89489957135f51"} Nov 29 01:28:15 crc kubenswrapper[4749]: I1129 01:28:15.546033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-4fxz8" event={"ID":"516a7074-ce39-4245-848d-8fc40b801000","Type":"ContainerStarted","Data":"fedb51103dbf974335d761fad585fd491eab33e4091c4326a7802d5b556b2f6e"} Nov 29 01:28:15 crc kubenswrapper[4749]: I1129 01:28:15.577782 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-4fxz8" podStartSLOduration=2.577744736 podStartE2EDuration="2.577744736s" podCreationTimestamp="2025-11-29 01:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:28:15.571922853 +0000 UTC m=+1038.744072790" watchObservedRunningTime="2025-11-29 01:28:15.577744736 +0000 UTC m=+1038.749894633" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.109658 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.111416 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.114799 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.114819 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8xl4d" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.115177 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.135180 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.202689 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6wx2\" (UniqueName: \"kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2\") pod \"openstack-operator-index-2mwp6\" (UID: \"3d03c41c-1ce8-46ea-9109-f4974ea2d69a\") " pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.304251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6wx2\" (UniqueName: \"kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2\") pod \"openstack-operator-index-2mwp6\" (UID: \"3d03c41c-1ce8-46ea-9109-f4974ea2d69a\") " pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.328824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6wx2\" (UniqueName: \"kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2\") pod \"openstack-operator-index-2mwp6\" (UID: \"3d03c41c-1ce8-46ea-9109-f4974ea2d69a\") " pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.437719 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:24 crc kubenswrapper[4749]: I1129 01:28:24.701448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:25 crc kubenswrapper[4749]: I1129 01:28:25.654147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2mwp6" event={"ID":"3d03c41c-1ce8-46ea-9109-f4974ea2d69a","Type":"ContainerStarted","Data":"ebfc10c6793748d647d8cf413331e19b2fcfd8a32152e61b0a587f35f93ce8ba"} Nov 29 01:28:27 crc kubenswrapper[4749]: I1129 01:28:27.461984 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:27 crc kubenswrapper[4749]: I1129 01:28:27.674396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2mwp6" event={"ID":"3d03c41c-1ce8-46ea-9109-f4974ea2d69a","Type":"ContainerStarted","Data":"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0"} Nov 29 01:28:27 crc kubenswrapper[4749]: I1129 01:28:27.708856 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2mwp6" podStartSLOduration=1.2356419299999999 podStartE2EDuration="3.708819867s" podCreationTimestamp="2025-11-29 01:28:24 +0000 UTC" firstStartedPulling="2025-11-29 01:28:24.722302107 +0000 UTC m=+1047.894451964" lastFinishedPulling="2025-11-29 01:28:27.195480044 +0000 UTC m=+1050.367629901" observedRunningTime="2025-11-29 01:28:27.698864562 +0000 UTC m=+1050.871014429" watchObservedRunningTime="2025-11-29 01:28:27.708819867 +0000 UTC m=+1050.880969764" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.079651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mc697"] Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.081296 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.095241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mc697"] Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.174103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwvsl\" (UniqueName: \"kubernetes.io/projected/ec8927f6-9db7-4af0-b09b-ecb2e7ebade2-kube-api-access-rwvsl\") pod \"openstack-operator-index-mc697\" (UID: \"ec8927f6-9db7-4af0-b09b-ecb2e7ebade2\") " pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.276491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwvsl\" (UniqueName: \"kubernetes.io/projected/ec8927f6-9db7-4af0-b09b-ecb2e7ebade2-kube-api-access-rwvsl\") pod \"openstack-operator-index-mc697\" (UID: \"ec8927f6-9db7-4af0-b09b-ecb2e7ebade2\") " pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.309578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwvsl\" (UniqueName: \"kubernetes.io/projected/ec8927f6-9db7-4af0-b09b-ecb2e7ebade2-kube-api-access-rwvsl\") pod \"openstack-operator-index-mc697\" (UID: \"ec8927f6-9db7-4af0-b09b-ecb2e7ebade2\") " pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.422380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.698046 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-2mwp6" podUID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" containerName="registry-server" containerID="cri-o://4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0" gracePeriod=2 Nov 29 01:28:28 crc kubenswrapper[4749]: I1129 01:28:28.816435 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mc697"] Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.092912 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.193356 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6wx2\" (UniqueName: \"kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2\") pod \"3d03c41c-1ce8-46ea-9109-f4974ea2d69a\" (UID: \"3d03c41c-1ce8-46ea-9109-f4974ea2d69a\") " Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.201302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2" (OuterVolumeSpecName: "kube-api-access-g6wx2") pod "3d03c41c-1ce8-46ea-9109-f4974ea2d69a" (UID: "3d03c41c-1ce8-46ea-9109-f4974ea2d69a"). InnerVolumeSpecName "kube-api-access-g6wx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.296233 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6wx2\" (UniqueName: \"kubernetes.io/projected/3d03c41c-1ce8-46ea-9109-f4974ea2d69a-kube-api-access-g6wx2\") on node \"crc\" DevicePath \"\"" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.709029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mc697" event={"ID":"ec8927f6-9db7-4af0-b09b-ecb2e7ebade2","Type":"ContainerStarted","Data":"35de2bb0b38191594aa7ef353a36ac1679b3fbe169389d0f5c5aa27b25f20b4d"} Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.709131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mc697" event={"ID":"ec8927f6-9db7-4af0-b09b-ecb2e7ebade2","Type":"ContainerStarted","Data":"2e97419f41de4a828b62c259a7eb1362e07b03459d97979e7d8d4fd7563c75e1"} Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.711429 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" containerID="4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0" exitCode=0 Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.711509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2mwp6" event={"ID":"3d03c41c-1ce8-46ea-9109-f4974ea2d69a","Type":"ContainerDied","Data":"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0"} Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.711555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2mwp6" event={"ID":"3d03c41c-1ce8-46ea-9109-f4974ea2d69a","Type":"ContainerDied","Data":"ebfc10c6793748d647d8cf413331e19b2fcfd8a32152e61b0a587f35f93ce8ba"} Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.711589 4749 scope.go:117] "RemoveContainer" containerID="4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.711787 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-2mwp6" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.744028 4749 scope.go:117] "RemoveContainer" containerID="4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0" Nov 29 01:28:29 crc kubenswrapper[4749]: E1129 01:28:29.744792 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0\": container with ID starting with 4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0 not found: ID does not exist" containerID="4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.744851 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0"} err="failed to get container status \"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0\": rpc error: code = NotFound desc = could not find container \"4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0\": container with ID starting with 4d29aef31246937a85621e78e1d4a5414ab9c982d10809b7485c0e87fcae4ae0 not found: ID does not exist" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.747620 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mc697" podStartSLOduration=1.695179642 podStartE2EDuration="1.747592469s" podCreationTimestamp="2025-11-29 01:28:28 +0000 UTC" firstStartedPulling="2025-11-29 01:28:28.828762772 +0000 UTC m=+1052.000912629" lastFinishedPulling="2025-11-29 01:28:28.881175579 +0000 UTC m=+1052.053325456" observedRunningTime="2025-11-29 01:28:29.735660166 +0000 UTC m=+1052.907810113" watchObservedRunningTime="2025-11-29 01:28:29.747592469 +0000 UTC m=+1052.919742396" Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.771853 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:29 crc kubenswrapper[4749]: I1129 01:28:29.780288 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-2mwp6"] Nov 29 01:28:31 crc kubenswrapper[4749]: I1129 01:28:31.100389 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" path="/var/lib/kubelet/pods/3d03c41c-1ce8-46ea-9109-f4974ea2d69a/volumes" Nov 29 01:28:38 crc kubenswrapper[4749]: I1129 01:28:38.423601 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:38 crc kubenswrapper[4749]: I1129 01:28:38.424828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:38 crc kubenswrapper[4749]: I1129 01:28:38.476983 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:38 crc kubenswrapper[4749]: I1129 01:28:38.869881 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mc697" Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.502714 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"] Nov 29 01:28:46 
Nov 29 01:28:46 crc kubenswrapper[4749]: E1129 01:28:46.503268 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" containerName="registry-server"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.503293 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" containerName="registry-server"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.503615 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d03c41c-1ce8-46ea-9109-f4974ea2d69a" containerName="registry-server"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.505297 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.508769 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-qxldm"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.523092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"]
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.634126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p584\" (UniqueName: \"kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.634564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.634681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.737814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p584\" (UniqueName: \"kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.737996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.738046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.738789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.739009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.765575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p584\" (UniqueName: \"kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584\") pod \"2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Nov 29 01:28:46 crc kubenswrapper[4749]: I1129 01:28:46.838064 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"
Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" Nov 29 01:28:47 crc kubenswrapper[4749]: I1129 01:28:47.421500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw"] Nov 29 01:28:47 crc kubenswrapper[4749]: W1129 01:28:47.434966 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecba3528_c9f0_4ec0_8b76_34aad22f4d4b.slice/crio-e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149 WatchSource:0}: Error finding container e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149: Status 404 returned error can't find the container with id e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149 Nov 29 01:28:47 crc kubenswrapper[4749]: I1129 01:28:47.917795 4749 generic.go:334] "Generic (PLEG): container finished" podID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerID="31e584fb8db1b0881ce87fd1b1247d09e2786eaeea9d7a71c46638645e9420f8" exitCode=0 Nov 29 01:28:47 crc kubenswrapper[4749]: I1129 01:28:47.917879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" event={"ID":"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b","Type":"ContainerDied","Data":"31e584fb8db1b0881ce87fd1b1247d09e2786eaeea9d7a71c46638645e9420f8"} Nov 29 01:28:47 crc kubenswrapper[4749]: I1129 01:28:47.919394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" event={"ID":"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b","Type":"ContainerStarted","Data":"e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149"} Nov 29 01:28:48 crc kubenswrapper[4749]: I1129 01:28:48.935316 4749 generic.go:334] "Generic (PLEG): container finished" podID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerID="522158cbd809cb8cf18119f082725dda4d6aed3afec5fc71a03f4e270d0d8118" exitCode=0 Nov 29 01:28:48 crc kubenswrapper[4749]: I1129 01:28:48.935397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" event={"ID":"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b","Type":"ContainerDied","Data":"522158cbd809cb8cf18119f082725dda4d6aed3afec5fc71a03f4e270d0d8118"} Nov 29 01:28:49 crc kubenswrapper[4749]: I1129 01:28:49.951348 4749 generic.go:334] "Generic (PLEG): container finished" podID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerID="64d2350924e0369c6a7f4035e634d1649ceae49363327ce9e97a838d7e661972" exitCode=0 Nov 29 01:28:49 crc kubenswrapper[4749]: I1129 01:28:49.951424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" event={"ID":"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b","Type":"ContainerDied","Data":"64d2350924e0369c6a7f4035e634d1649ceae49363327ce9e97a838d7e661972"} Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.302277 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.327684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p584\" (UniqueName: \"kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584\") pod \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.328321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util\") pod \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.328428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle\") pod \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\" (UID: \"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b\") " Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.329430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle" (OuterVolumeSpecName: "bundle") pod "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" (UID: "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.349300 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util" (OuterVolumeSpecName: "util") pod "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" (UID: "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.368662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584" (OuterVolumeSpecName: "kube-api-access-7p584") pod "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" (UID: "ecba3528-c9f0-4ec0-8b76-34aad22f4d4b"). InnerVolumeSpecName "kube-api-access-7p584". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.433921 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p584\" (UniqueName: \"kubernetes.io/projected/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-kube-api-access-7p584\") on node \"crc\" DevicePath \"\"" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.433999 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-util\") on node \"crc\" DevicePath \"\"" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.434028 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ecba3528-c9f0-4ec0-8b76-34aad22f4d4b-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.975599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" event={"ID":"ecba3528-c9f0-4ec0-8b76-34aad22f4d4b","Type":"ContainerDied","Data":"e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149"} Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.975659 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e82a75ad07f446f8f9e957a93064ad467e77b819ab25866ab093f7a9918ae149" Nov 29 01:28:51 crc kubenswrapper[4749]: I1129 01:28:51.975778 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.499174 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22"] Nov 29 01:28:58 crc kubenswrapper[4749]: E1129 01:28:58.500292 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="extract" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.500311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="extract" Nov 29 01:28:58 crc kubenswrapper[4749]: E1129 01:28:58.500328 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="util" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.500336 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="util" Nov 29 01:28:58 crc kubenswrapper[4749]: E1129 01:28:58.500386 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="pull" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.500394 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="pull" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.500540 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecba3528-c9f0-4ec0-8b76-34aad22f4d4b" containerName="extract" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.501157 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.511518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-r2d9w" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.530535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22"] Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.556013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6k6t\" (UniqueName: \"kubernetes.io/projected/24610c1d-41d1-42a1-8aa1-654cf868a283-kube-api-access-f6k6t\") pod \"openstack-operator-controller-operator-6dbf9ff7bd-zvr22\" (UID: \"24610c1d-41d1-42a1-8aa1-654cf868a283\") " pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.657469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6k6t\" (UniqueName: \"kubernetes.io/projected/24610c1d-41d1-42a1-8aa1-654cf868a283-kube-api-access-f6k6t\") pod \"openstack-operator-controller-operator-6dbf9ff7bd-zvr22\" (UID: \"24610c1d-41d1-42a1-8aa1-654cf868a283\") " pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.693824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6k6t\" (UniqueName: \"kubernetes.io/projected/24610c1d-41d1-42a1-8aa1-654cf868a283-kube-api-access-f6k6t\") pod \"openstack-operator-controller-operator-6dbf9ff7bd-zvr22\" (UID: \"24610c1d-41d1-42a1-8aa1-654cf868a283\") " pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:28:58 crc kubenswrapper[4749]: I1129 01:28:58.820323 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:28:59 crc kubenswrapper[4749]: I1129 01:28:59.146553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22"] Nov 29 01:28:59 crc kubenswrapper[4749]: W1129 01:28:59.153289 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24610c1d_41d1_42a1_8aa1_654cf868a283.slice/crio-eb3a185e8f21252a71abc2ef11bd90f4877be95f6b92a707ce353f3c5a26024e WatchSource:0}: Error finding container eb3a185e8f21252a71abc2ef11bd90f4877be95f6b92a707ce353f3c5a26024e: Status 404 returned error can't find the container with id eb3a185e8f21252a71abc2ef11bd90f4877be95f6b92a707ce353f3c5a26024e Nov 29 01:28:59 crc kubenswrapper[4749]: I1129 01:28:59.156391 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 01:29:00 crc kubenswrapper[4749]: I1129 01:29:00.057062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" event={"ID":"24610c1d-41d1-42a1-8aa1-654cf868a283","Type":"ContainerStarted","Data":"eb3a185e8f21252a71abc2ef11bd90f4877be95f6b92a707ce353f3c5a26024e"} Nov 29 01:29:05 crc kubenswrapper[4749]: I1129 01:29:05.108139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" event={"ID":"24610c1d-41d1-42a1-8aa1-654cf868a283","Type":"ContainerStarted","Data":"2ff2bbc273863e54d95af30d19386ff609e8662685b4bdb1eaf6ea7e55c1abdd"} Nov 29 01:29:05 crc kubenswrapper[4749]: I1129 01:29:05.109417 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:29:05 crc kubenswrapper[4749]: I1129 01:29:05.156792 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" podStartSLOduration=2.125957852 podStartE2EDuration="7.156746749s" podCreationTimestamp="2025-11-29 01:28:58 +0000 UTC" firstStartedPulling="2025-11-29 01:28:59.155886677 +0000 UTC m=+1082.328036534" lastFinishedPulling="2025-11-29 01:29:04.186675574 +0000 UTC m=+1087.358825431" observedRunningTime="2025-11-29 01:29:05.153611192 +0000 UTC m=+1088.325761079" watchObservedRunningTime="2025-11-29 01:29:05.156746749 +0000 UTC m=+1088.328896616" Nov 29 01:29:18 crc kubenswrapper[4749]: I1129 01:29:18.824913 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6dbf9ff7bd-zvr22" Nov 29 01:29:25 crc kubenswrapper[4749]: I1129 01:29:25.375094 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:29:25 crc kubenswrapper[4749]: I1129 01:29:25.376190 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.452086 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.454164 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.457842 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-f62rf" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.465742 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.467107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.468751 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-x5z47" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.469906 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.471233 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.473815 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.475147 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f4wzg" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.502790 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.509897 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.511247 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.518427 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8bkdt" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.522106 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.545478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.547004 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.550741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-9st8z" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.598277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.608606 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.612960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl84k\" (UniqueName: \"kubernetes.io/projected/3bbc7cc2-0efd-4d6f-b424-d1558ed9f040-kube-api-access-gl84k\") pod \"designate-operator-controller-manager-78b4bc895b-x7lmd\" (UID: \"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.613101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrs96\" (UniqueName: \"kubernetes.io/projected/9e0d34d5-9c78-4c5b-8081-e076cde59208-kube-api-access-zrs96\") pod \"glance-operator-controller-manager-668d9c48b9-wm9q9\" (UID: \"9e0d34d5-9c78-4c5b-8081-e076cde59208\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.613154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpls\" (UniqueName: \"kubernetes.io/projected/3f250151-87d8-495b-895a-c43205c7b8ce-kube-api-access-cwpls\") pod \"barbican-operator-controller-manager-7d9dfd778-5jr7h\" (UID: \"3f250151-87d8-495b-895a-c43205c7b8ce\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.613183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmjtc\" (UniqueName: \"kubernetes.io/projected/08e10646-6c79-42a1-8180-b2f7595e73ce-kube-api-access-qmjtc\") pod \"cinder-operator-controller-manager-859b6ccc6-6xbh8\" (UID: \"08e10646-6c79-42a1-8180-b2f7595e73ce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.630122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.656810 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.659841 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dfgmf" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.695293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.714912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpgs\" (UniqueName: \"kubernetes.io/projected/ae863e3f-87c3-4712-9e4d-5fcfa63df10b-kube-api-access-dkpgs\") pod \"heat-operator-controller-manager-5f64f6f8bb-s5vxs\" (UID: \"ae863e3f-87c3-4712-9e4d-5fcfa63df10b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.715382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrs96\" (UniqueName: \"kubernetes.io/projected/9e0d34d5-9c78-4c5b-8081-e076cde59208-kube-api-access-zrs96\") pod \"glance-operator-controller-manager-668d9c48b9-wm9q9\" (UID: \"9e0d34d5-9c78-4c5b-8081-e076cde59208\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.715503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpls\" (UniqueName: \"kubernetes.io/projected/3f250151-87d8-495b-895a-c43205c7b8ce-kube-api-access-cwpls\") pod \"barbican-operator-controller-manager-7d9dfd778-5jr7h\" (UID: \"3f250151-87d8-495b-895a-c43205c7b8ce\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.715587 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmjtc\" (UniqueName: \"kubernetes.io/projected/08e10646-6c79-42a1-8180-b2f7595e73ce-kube-api-access-qmjtc\") pod \"cinder-operator-controller-manager-859b6ccc6-6xbh8\" (UID: \"08e10646-6c79-42a1-8180-b2f7595e73ce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.715854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl84k\" (UniqueName: \"kubernetes.io/projected/3bbc7cc2-0efd-4d6f-b424-d1558ed9f040-kube-api-access-gl84k\") pod \"designate-operator-controller-manager-78b4bc895b-x7lmd\" (UID: \"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.715927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2r6k\" (UniqueName: \"kubernetes.io/projected/0ba380f8-eaae-4987-add4-bdd6aa96f090-kube-api-access-p2r6k\") pod \"horizon-operator-controller-manager-68c6d99b8f-ww9sb\" (UID: \"0ba380f8-eaae-4987-add4-bdd6aa96f090\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.726251 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 
01:29:46.727834 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.736437 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.738292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.741313 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.747349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.749648 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.749954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zftd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.750071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-jp2zk" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.755971 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.757275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.762133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.766437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpls\" (UniqueName: \"kubernetes.io/projected/3f250151-87d8-495b-895a-c43205c7b8ce-kube-api-access-cwpls\") pod \"barbican-operator-controller-manager-7d9dfd778-5jr7h\" (UID: \"3f250151-87d8-495b-895a-c43205c7b8ce\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.768498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl84k\" (UniqueName: \"kubernetes.io/projected/3bbc7cc2-0efd-4d6f-b424-d1558ed9f040-kube-api-access-gl84k\") pod \"designate-operator-controller-manager-78b4bc895b-x7lmd\" (UID: \"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.776282 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-f499g"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.777839 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.781956 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-dmvlk" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.787434 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-cdtdf" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.794861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.795967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrs96\" (UniqueName: \"kubernetes.io/projected/9e0d34d5-9c78-4c5b-8081-e076cde59208-kube-api-access-zrs96\") pod \"glance-operator-controller-manager-668d9c48b9-wm9q9\" (UID: \"9e0d34d5-9c78-4c5b-8081-e076cde59208\") " pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.802273 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-f499g"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.808594 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818372 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2r6k\" (UniqueName: \"kubernetes.io/projected/0ba380f8-eaae-4987-add4-bdd6aa96f090-kube-api-access-p2r6k\") pod \"horizon-operator-controller-manager-68c6d99b8f-ww9sb\" (UID: \"0ba380f8-eaae-4987-add4-bdd6aa96f090\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdbvc\" (UniqueName: \"kubernetes.io/projected/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-kube-api-access-gdbvc\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpkhp\" (UniqueName: \"kubernetes.io/projected/b2daf909-0247-4a43-a96a-a136e5268260-kube-api-access-mpkhp\") pod \"keystone-operator-controller-manager-546d4bdf48-5c65k\" (UID: \"b2daf909-0247-4a43-a96a-a136e5268260\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818550 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkpgs\" (UniqueName: \"kubernetes.io/projected/ae863e3f-87c3-4712-9e4d-5fcfa63df10b-kube-api-access-dkpgs\") pod \"heat-operator-controller-manager-5f64f6f8bb-s5vxs\" (UID: \"ae863e3f-87c3-4712-9e4d-5fcfa63df10b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.818782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmf8g\" (UniqueName: \"kubernetes.io/projected/ee9e0c71-281c-41b2-b566-c0222b456f23-kube-api-access-vmf8g\") pod \"ironic-operator-controller-manager-6c548fd776-662nk\" (UID: \"ee9e0c71-281c-41b2-b566-c0222b456f23\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.827707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmjtc\" (UniqueName: \"kubernetes.io/projected/08e10646-6c79-42a1-8180-b2f7595e73ce-kube-api-access-qmjtc\") pod \"cinder-operator-controller-manager-859b6ccc6-6xbh8\" (UID: \"08e10646-6c79-42a1-8180-b2f7595e73ce\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.830614 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.850271 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.851686 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.864745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2r6k\" (UniqueName: \"kubernetes.io/projected/0ba380f8-eaae-4987-add4-bdd6aa96f090-kube-api-access-p2r6k\") pod \"horizon-operator-controller-manager-68c6d99b8f-ww9sb\" (UID: \"0ba380f8-eaae-4987-add4-bdd6aa96f090\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.871152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6fgqt" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.887115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkpgs\" (UniqueName: \"kubernetes.io/projected/ae863e3f-87c3-4712-9e4d-5fcfa63df10b-kube-api-access-dkpgs\") pod \"heat-operator-controller-manager-5f64f6f8bb-s5vxs\" (UID: \"ae863e3f-87c3-4712-9e4d-5fcfa63df10b\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.923424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpkhp\" (UniqueName: \"kubernetes.io/projected/b2daf909-0247-4a43-a96a-a136e5268260-kube-api-access-mpkhp\") pod \"keystone-operator-controller-manager-546d4bdf48-5c65k\" (UID: \"b2daf909-0247-4a43-a96a-a136e5268260\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.923695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9vw\" (UniqueName: \"kubernetes.io/projected/d0754b9d-ce96-4174-83f2-c4436e7d8195-kube-api-access-7m9vw\") pod \"manila-operator-controller-manager-6546668bfd-f499g\" (UID: \"d0754b9d-ce96-4174-83f2-c4436e7d8195\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.924039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmf8g\" (UniqueName: \"kubernetes.io/projected/ee9e0c71-281c-41b2-b566-c0222b456f23-kube-api-access-vmf8g\") pod \"ironic-operator-controller-manager-6c548fd776-662nk\" (UID: \"ee9e0c71-281c-41b2-b566-c0222b456f23\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.924155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.924278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc94x\" (UniqueName: \"kubernetes.io/projected/8609ba03-25af-49c9-b521-8c637dab5e91-kube-api-access-dc94x\") pod \"mariadb-operator-controller-manager-56bbcc9d85-56rwq\" (UID: \"8609ba03-25af-49c9-b521-8c637dab5e91\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:29:46 crc kubenswrapper[4749]: 
I1129 01:29:46.924367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdbvc\" (UniqueName: \"kubernetes.io/projected/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-kube-api-access-gdbvc\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: E1129 01:29:46.924962 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:46 crc kubenswrapper[4749]: E1129 01:29:46.929534 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:29:47.4295075 +0000 UTC m=+1130.601657347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.935860 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.937595 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.941651 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6p4vv" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.959309 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7"] Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.968591 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.981758 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bmqtb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.997797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdbvc\" (UniqueName: \"kubernetes.io/projected/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-kube-api-access-gdbvc\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.998059 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:29:46 crc kubenswrapper[4749]: I1129 01:29:46.998420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmf8g\" (UniqueName: \"kubernetes.io/projected/ee9e0c71-281c-41b2-b566-c0222b456f23-kube-api-access-vmf8g\") pod \"ironic-operator-controller-manager-6c548fd776-662nk\" (UID: \"ee9e0c71-281c-41b2-b566-c0222b456f23\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.005699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpkhp\" (UniqueName: \"kubernetes.io/projected/b2daf909-0247-4a43-a96a-a136e5268260-kube-api-access-mpkhp\") pod \"keystone-operator-controller-manager-546d4bdf48-5c65k\" (UID: \"b2daf909-0247-4a43-a96a-a136e5268260\") " pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.010186 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.032535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9vw\" (UniqueName: \"kubernetes.io/projected/d0754b9d-ce96-4174-83f2-c4436e7d8195-kube-api-access-7m9vw\") pod \"manila-operator-controller-manager-6546668bfd-f499g\" (UID: \"d0754b9d-ce96-4174-83f2-c4436e7d8195\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.032928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm6dq\" (UniqueName: \"kubernetes.io/projected/9ed58501-79d8-4626-bd9f-dae8a95c872c-kube-api-access-qm6dq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-l5s92\" (UID: \"9ed58501-79d8-4626-bd9f-dae8a95c872c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.034831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jkvs\" (UniqueName: \"kubernetes.io/projected/2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f-kube-api-access-8jkvs\") pod \"nova-operator-controller-manager-697bc559fc-7hch7\" (UID: \"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.035033 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc94x\" (UniqueName: \"kubernetes.io/projected/8609ba03-25af-49c9-b521-8c637dab5e91-kube-api-access-dc94x\") pod \"mariadb-operator-controller-manager-56bbcc9d85-56rwq\" (UID: \"8609ba03-25af-49c9-b521-8c637dab5e91\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.037873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.054313 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.066755 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.068494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.071932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9vw\" (UniqueName: \"kubernetes.io/projected/d0754b9d-ce96-4174-83f2-c4436e7d8195-kube-api-access-7m9vw\") pod \"manila-operator-controller-manager-6546668bfd-f499g\" (UID: \"d0754b9d-ce96-4174-83f2-c4436e7d8195\") " pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.073959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.075908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc94x\" (UniqueName: \"kubernetes.io/projected/8609ba03-25af-49c9-b521-8c637dab5e91-kube-api-access-dc94x\") pod \"mariadb-operator-controller-manager-56bbcc9d85-56rwq\" (UID: \"8609ba03-25af-49c9-b521-8c637dab5e91\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.080214 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-7wds7" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.120024 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.140055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jkvs\" (UniqueName: \"kubernetes.io/projected/2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f-kube-api-access-8jkvs\") pod \"nova-operator-controller-manager-697bc559fc-7hch7\" (UID: \"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.140477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9zm\" (UniqueName: \"kubernetes.io/projected/2ad01dbb-582f-4074-a985-76067fc2bed3-kube-api-access-xz9zm\") pod \"octavia-operator-controller-manager-998648c74-8lkfs\" (UID: \"2ad01dbb-582f-4074-a985-76067fc2bed3\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.140578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm6dq\" (UniqueName: \"kubernetes.io/projected/9ed58501-79d8-4626-bd9f-dae8a95c872c-kube-api-access-qm6dq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-l5s92\" (UID: \"9ed58501-79d8-4626-bd9f-dae8a95c872c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.161401 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.162699 4749 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.165082 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.178929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-x24cz" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.188059 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.204113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm6dq\" (UniqueName: \"kubernetes.io/projected/9ed58501-79d8-4626-bd9f-dae8a95c872c-kube-api-access-qm6dq\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-l5s92\" (UID: \"9ed58501-79d8-4626-bd9f-dae8a95c872c\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.248511 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dm22j"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.249942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.250336 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.250991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9zm\" (UniqueName: \"kubernetes.io/projected/2ad01dbb-582f-4074-a985-76067fc2bed3-kube-api-access-xz9zm\") pod \"octavia-operator-controller-manager-998648c74-8lkfs\" (UID: \"2ad01dbb-582f-4074-a985-76067fc2bed3\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.251078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vq7\" (UniqueName: \"kubernetes.io/projected/b08fc4d5-cf16-49c3-b95d-e9175ab67846-kube-api-access-d9vq7\") pod \"ovn-operator-controller-manager-b6456fdb6-2kq6d\" (UID: \"b08fc4d5-cf16-49c3-b95d-e9175ab67846\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.255565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jkvs\" (UniqueName: \"kubernetes.io/projected/2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f-kube-api-access-8jkvs\") pod \"nova-operator-controller-manager-697bc559fc-7hch7\" (UID: \"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.293072 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.298296 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.299832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-c5kzp" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.300348 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dm22j"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.300528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.305229 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dg2kw" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.330043 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.355398 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.356039 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.357752 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.357744 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x5wq\" (UniqueName: \"kubernetes.io/projected/beb586a3-ac88-42b7-b080-8b68cb73bf53-kube-api-access-7x5wq\") pod \"swift-operator-controller-manager-5f8c65bbfc-ldkwb\" (UID: \"beb586a3-ac88-42b7-b080-8b68cb73bf53\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.358066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vq7\" (UniqueName: \"kubernetes.io/projected/b08fc4d5-cf16-49c3-b95d-e9175ab67846-kube-api-access-d9vq7\") pod \"ovn-operator-controller-manager-b6456fdb6-2kq6d\" (UID: \"b08fc4d5-cf16-49c3-b95d-e9175ab67846\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.358256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvchq\" (UniqueName: \"kubernetes.io/projected/14d6b00a-a750-4bc4-9d78-12dcefeafe6b-kube-api-access-jvchq\") pod \"placement-operator-controller-manager-78f8948974-dm22j\" (UID: \"14d6b00a-a750-4bc4-9d78-12dcefeafe6b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.368098 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.368338 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-s2nx6" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.370232 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.385845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9zm\" (UniqueName: \"kubernetes.io/projected/2ad01dbb-582f-4074-a985-76067fc2bed3-kube-api-access-xz9zm\") pod \"octavia-operator-controller-manager-998648c74-8lkfs\" (UID: \"2ad01dbb-582f-4074-a985-76067fc2bed3\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.418136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vq7\" (UniqueName: \"kubernetes.io/projected/b08fc4d5-cf16-49c3-b95d-e9175ab67846-kube-api-access-d9vq7\") pod \"ovn-operator-controller-manager-b6456fdb6-2kq6d\" (UID: \"b08fc4d5-cf16-49c3-b95d-e9175ab67846\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.422176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.426283 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.432166 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.433633 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.445971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-lgf5x" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.446732 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.457814 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.469887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvchq\" (UniqueName: \"kubernetes.io/projected/14d6b00a-a750-4bc4-9d78-12dcefeafe6b-kube-api-access-jvchq\") pod \"placement-operator-controller-manager-78f8948974-dm22j\" (UID: \"14d6b00a-a750-4bc4-9d78-12dcefeafe6b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.469972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b5b4\" (UniqueName: \"kubernetes.io/projected/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-kube-api-access-5b5b4\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.469997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrjn\" (UniqueName: \"kubernetes.io/projected/a493c4bc-b7d4-4e55-bc8f-205242be99eb-kube-api-access-wqrjn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fg7sg\" (UID: \"a493c4bc-b7d4-4e55-bc8f-205242be99eb\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.470074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.470102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.470153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x5wq\" (UniqueName: \"kubernetes.io/projected/beb586a3-ac88-42b7-b080-8b68cb73bf53-kube-api-access-7x5wq\") pod \"swift-operator-controller-manager-5f8c65bbfc-ldkwb\" (UID: \"beb586a3-ac88-42b7-b080-8b68cb73bf53\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:29:47 crc 
kubenswrapper[4749]: E1129 01:29:47.470462 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.470527 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:29:48.470502382 +0000 UTC m=+1131.642652239 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.473158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.487316 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.488946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.492920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x5wq\" (UniqueName: \"kubernetes.io/projected/beb586a3-ac88-42b7-b080-8b68cb73bf53-kube-api-access-7x5wq\") pod \"swift-operator-controller-manager-5f8c65bbfc-ldkwb\" (UID: \"beb586a3-ac88-42b7-b080-8b68cb73bf53\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.493688 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-v46b8" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.494285 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.501742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvchq\" (UniqueName: \"kubernetes.io/projected/14d6b00a-a750-4bc4-9d78-12dcefeafe6b-kube-api-access-jvchq\") pod \"placement-operator-controller-manager-78f8948974-dm22j\" (UID: \"14d6b00a-a750-4bc4-9d78-12dcefeafe6b\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.509609 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.537385 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.540208 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.543077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-49qrk" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.572900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqnc\" (UniqueName: \"kubernetes.io/projected/913736f1-2790-4ac3-a478-58de73caee8f-kube-api-access-5wqnc\") pod \"test-operator-controller-manager-5854674fcc-ssmp5\" (UID: \"913736f1-2790-4ac3-a478-58de73caee8f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.573189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b5b4\" (UniqueName: \"kubernetes.io/projected/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-kube-api-access-5b5b4\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.573287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrjn\" (UniqueName: \"kubernetes.io/projected/a493c4bc-b7d4-4e55-bc8f-205242be99eb-kube-api-access-wqrjn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fg7sg\" (UID: \"a493c4bc-b7d4-4e55-bc8f-205242be99eb\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.573356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qxz7\" (UniqueName: \"kubernetes.io/projected/88cd9373-83ec-44e6-b108-04d0b853b5da-kube-api-access-4qxz7\") pod \"watcher-operator-controller-manager-769dc69bc-6jvjq\" (UID: \"88cd9373-83ec-44e6-b108-04d0b853b5da\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.573769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.575888 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.575983 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert podName:76c23b91-6df4-41e0-bcd3-eacc7e879aeb nodeName:}" failed. No retries permitted until 2025-11-29 01:29:48.075959121 +0000 UTC m=+1131.248108978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" (UID: "76c23b91-6df4-41e0-bcd3-eacc7e879aeb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.593548 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.607715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b5b4\" (UniqueName: \"kubernetes.io/projected/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-kube-api-access-5b5b4\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.612508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrjn\" (UniqueName: \"kubernetes.io/projected/a493c4bc-b7d4-4e55-bc8f-205242be99eb-kube-api-access-wqrjn\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fg7sg\" (UID: \"a493c4bc-b7d4-4e55-bc8f-205242be99eb\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.624517 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.625799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.655360 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.655883 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.655930 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-dtvvd" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.658114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.676696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.688984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqnc\" (UniqueName: \"kubernetes.io/projected/913736f1-2790-4ac3-a478-58de73caee8f-kube-api-access-5wqnc\") pod \"test-operator-controller-manager-5854674fcc-ssmp5\" (UID: \"913736f1-2790-4ac3-a478-58de73caee8f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:29:47 
crc kubenswrapper[4749]: I1129 01:29:47.689069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwgm4\" (UniqueName: \"kubernetes.io/projected/be884b00-8556-44c8-83e8-c851267b63e2-kube-api-access-lwgm4\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.689180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qxz7\" (UniqueName: \"kubernetes.io/projected/88cd9373-83ec-44e6-b108-04d0b853b5da-kube-api-access-4qxz7\") pod \"watcher-operator-controller-manager-769dc69bc-6jvjq\" (UID: \"88cd9373-83ec-44e6-b108-04d0b853b5da\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.689411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.716054 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.716803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqnc\" (UniqueName: \"kubernetes.io/projected/913736f1-2790-4ac3-a478-58de73caee8f-kube-api-access-5wqnc\") pod \"test-operator-controller-manager-5854674fcc-ssmp5\" (UID: \"913736f1-2790-4ac3-a478-58de73caee8f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.717579 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.717885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qxz7\" (UniqueName: \"kubernetes.io/projected/88cd9373-83ec-44e6-b108-04d0b853b5da-kube-api-access-4qxz7\") pod \"watcher-operator-controller-manager-769dc69bc-6jvjq\" (UID: \"88cd9373-83ec-44e6-b108-04d0b853b5da\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.720657 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-rgr2f" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.767847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.775583 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.790278 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.791928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs22h\" (UniqueName: \"kubernetes.io/projected/872c8278-f904-4ef0-8180-46fd4beea0dd-kube-api-access-hs22h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kcnjp\" (UID: \"872c8278-f904-4ef0-8180-46fd4beea0dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.792111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwgm4\" (UniqueName: \"kubernetes.io/projected/be884b00-8556-44c8-83e8-c851267b63e2-kube-api-access-lwgm4\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.792412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.792491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.792860 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.793214 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:48.292912056 +0000 UTC m=+1131.465062123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "webhook-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.794566 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: E1129 01:29:47.794695 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:48.294671089 +0000 UTC m=+1131.466820946 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "metrics-server-cert" not found Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.819647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwgm4\" (UniqueName: \"kubernetes.io/projected/be884b00-8556-44c8-83e8-c851267b63e2-kube-api-access-lwgm4\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.850057 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.894891 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs22h\" (UniqueName: \"kubernetes.io/projected/872c8278-f904-4ef0-8180-46fd4beea0dd-kube-api-access-hs22h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kcnjp\" (UID: \"872c8278-f904-4ef0-8180-46fd4beea0dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.906688 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.924725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.926625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd"] Nov 29 01:29:47 crc kubenswrapper[4749]: I1129 01:29:47.939628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs22h\" (UniqueName: \"kubernetes.io/projected/872c8278-f904-4ef0-8180-46fd4beea0dd-kube-api-access-hs22h\") pod \"rabbitmq-cluster-operator-manager-668c99d594-kcnjp\" (UID: \"872c8278-f904-4ef0-8180-46fd4beea0dd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.075232 4749 util.go:30] "No sandbox for pod can be found. 
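The nestedpendingoperations entries above show the kubelet's per-volume retry policy: after each failed MountVolume.SetUp the next attempt is scheduled with a doubling durationBeforeRetry (500ms here, then 1s, 2s, and 4s further down in the log). Below is a minimal Go sketch of that doubling-delay pattern; the names and the fixed attempt count are hypothetical illustrations, not kubelet's actual nestedpendingoperations code.

```go
// Hypothetical sketch of the doubling retry delay visible in the log
// (durationBeforeRetry 500ms -> 1s -> 2s -> 4s); not kubelet's real code.
package main

import (
	"errors"
	"fmt"
	"time"
)

// mountWithBackoff retries attempt() with a delay that doubles after each
// failure, up to a cap, the way the retry windows grow in the log above.
func mountWithBackoff(attempt func() error, initial, max time.Duration, tries int) error {
	delay := initial
	for i := 1; i <= tries; i++ {
		if err := attempt(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed; no retries permitted for %v\n", i, delay)
		time.Sleep(delay)
		delay *= 2 // double the delay for the next attempt
		if delay > max {
			delay = max
		}
	}
	return errors.New("mount still failing after retries")
}

func main() {
	// The secret never appears in this demo, so every attempt fails and the
	// delay walks through 500ms, 1s, 2s, 4s, as in the entries above.
	err := mountWithBackoff(func() error {
		return errors.New(`secret "webhook-server-cert" not found`)
	}, 500*time.Millisecond, 2*time.Minute, 4)
	fmt.Println(err)
}
```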
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.101228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.101495 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.101598 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert podName:76c23b91-6df4-41e0-bcd3-eacc7e879aeb nodeName:}" failed. No retries permitted until 2025-11-29 01:29:49.101568854 +0000 UTC m=+1132.273718841 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" (UID: "76c23b91-6df4-41e0-bcd3-eacc7e879aeb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.192136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.212508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.305483 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.305559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.305700 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.305786 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:49.305765207 +0000 UTC m=+1132.477915064 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.305867 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.305943 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:49.305921191 +0000 UTC m=+1132.478071048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "metrics-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.469964 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" event={"ID":"9e0d34d5-9c78-4c5b-8081-e076cde59208","Type":"ContainerStarted","Data":"e3ab450f166cc18eb4695ae96d6ad09a1dd2803309ec3ce5c7b27c840a045051"} Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.473068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" event={"ID":"3f250151-87d8-495b-895a-c43205c7b8ce","Type":"ContainerStarted","Data":"b4bde3b12b8d4d781c8eabc5cef346f12339752a61767df3300c4a5b5167b96d"} Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.474493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" event={"ID":"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040","Type":"ContainerStarted","Data":"59f93a6cdff89389ab77e3fb96fa7953338534bc2b45d80a42430e0e6832c1af"} Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.508350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.508665 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.508738 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:29:50.508716349 +0000 UTC m=+1133.680866206 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.571448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.594168 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.605541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq"] Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.622669 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8609ba03_25af_49c9_b521_8c637dab5e91.slice/crio-734a63b01d4358a8353b18ff51580087a4b67187ea183a8ebec4bb66ae539a37 WatchSource:0}: Error finding container 734a63b01d4358a8353b18ff51580087a4b67187ea183a8ebec4bb66ae539a37: Status 404 returned error can't find the container with id 734a63b01d4358a8353b18ff51580087a4b67187ea183a8ebec4bb66ae539a37 Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.624216 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6546668bfd-f499g"] Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.627236 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae863e3f_87c3_4712_9e4d_5fcfa63df10b.slice/crio-8c9efd8ba1ac2b63a6b955e20654362b40d95d6975feb28d0b2dfee48c3b2e51 WatchSource:0}: Error finding container 8c9efd8ba1ac2b63a6b955e20654362b40d95d6975feb28d0b2dfee48c3b2e51: Status 404 returned error can't find the container with id 8c9efd8ba1ac2b63a6b955e20654362b40d95d6975feb28d0b2dfee48c3b2e51 Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.633745 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.650949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.657244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.661981 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.666327 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8"] Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.818356 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs"] Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.840484 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod913736f1_2790_4ac3_a478_58de73caee8f.slice/crio-7e900b776b88217bc58a316acf61d940d5aaf04bae3ae979a226743d919810de WatchSource:0}: Error finding container 7e900b776b88217bc58a316acf61d940d5aaf04bae3ae979a226743d919810de: Status 404 returned error can't find the container with id 7e900b776b88217bc58a316acf61d940d5aaf04bae3ae979a226743d919810de Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.848780 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14d6b00a_a750_4bc4_9d78_12dcefeafe6b.slice/crio-c29a75611139582d578226c24a76f7b445027c480996aa90df18714f07a3b218 WatchSource:0}: Error finding container c29a75611139582d578226c24a76f7b445027c480996aa90df18714f07a3b218: Status 404 returned error can't find the container with id c29a75611139582d578226c24a76f7b445027c480996aa90df18714f07a3b218 Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.851357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp"] Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.854512 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5wqnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
test-operator-controller-manager-5854674fcc-ssmp5_openstack-operators(913736f1-2790-4ac3-a478-58de73caee8f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.857860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-dm22j"] Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.857940 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5wqnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-ssmp5_openstack-operators(913736f1-2790-4ac3-a478-58de73caee8f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.859026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" podUID="913736f1-2790-4ac3-a478-58de73caee8f" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.863415 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb"] Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.870544 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod872c8278_f904_4ef0_8180_46fd4beea0dd.slice/crio-dc45771327bd07e7fa413a15977d4b7530601c4e957b22b05d9c0374f9d6f38b WatchSource:0}: Error finding container dc45771327bd07e7fa413a15977d4b7530601c4e957b22b05d9c0374f9d6f38b: Status 404 returned error can't find the container with id dc45771327bd07e7fa413a15977d4b7530601c4e957b22b05d9c0374f9d6f38b Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.871101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5"] Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.871658 4749 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7x5wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-ldkwb_openstack-operators(beb586a3-ac88-42b7-b080-8b68cb73bf53): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:29:48 crc kubenswrapper[4749]: W1129 01:29:48.873102 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda493c4bc_b7d4_4e55_bc8f_205242be99eb.slice/crio-d18b1d651073e13f7ddebcec38dc1c03300698d6b42c3dc9584a9ad72b56be16 WatchSource:0}: Error finding container d18b1d651073e13f7ddebcec38dc1c03300698d6b42c3dc9584a9ad72b56be16: Status 404 returned error can't find the container with id d18b1d651073e13f7ddebcec38dc1c03300698d6b42c3dc9584a9ad72b56be16 Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.873300 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hs22h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kcnjp_openstack-operators(872c8278-f904-4ef0-8180-46fd4beea0dd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.873914 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7x5wq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-ldkwb_openstack-operators(beb586a3-ac88-42b7-b080-8b68cb73bf53): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.875122 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" podUID="beb586a3-ac88-42b7-b080-8b68cb73bf53" Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.875238 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq"] Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.875427 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podUID="872c8278-f904-4ef0-8180-46fd4beea0dd" Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.875483 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqrjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.878035 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qxz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6jvjq_openstack-operators(88cd9373-83ec-44e6-b108-04d0b853b5da): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.878884 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wqrjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-fg7sg_openstack-operators(a493c4bc-b7d4-4e55-bc8f-205242be99eb): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.880161 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" podUID="a493c4bc-b7d4-4e55-bc8f-205242be99eb"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.881683 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qxz7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-6jvjq_openstack-operators(88cd9373-83ec-44e6-b108-04d0b853b5da): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.883019 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" podUID="88cd9373-83ec-44e6-b108-04d0b853b5da"
Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.884622 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg"]
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.889521 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9vq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2kq6d_openstack-operators(b08fc4d5-cf16-49c3-b95d-e9175ab67846): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.891439 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9vq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2kq6d_openstack-operators(b08fc4d5-cf16-49c3-b95d-e9175ab67846): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Nov 29 01:29:48 crc kubenswrapper[4749]: E1129 01:29:48.892710 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" podUID="b08fc4d5-cf16-49c3-b95d-e9175ab67846"
Nov 29 01:29:48 crc kubenswrapper[4749]: I1129 01:29:48.892843 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d"]
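Every ErrImagePull in this burst carries the same reason, "pull QPS exceeded": the kubelet rate-limits registry pulls (the registryPullQPS/registryBurst kubelet settings), so when a dozen operator deployments are scheduled at once, pulls beyond the budget fail immediately and the pods fall back to ImagePullBackOff. The following token-bucket sketch is illustrative only, with made-up budget numbers, and is not kubelet's actual limiter:

```go
// Illustrative token-bucket sketch of why a burst of simultaneous image
// pulls fails with "pull QPS exceeded"; not kubelet's real implementation.
package main

import "fmt"

type pullLimiter struct {
	tokens int // pulls still allowed in the current window
}

// tryPull consumes one token when available; with no token left the pull is
// rejected immediately, which kubelet surfaces as ErrImagePull.
func (l *pullLimiter) tryPull(image string) {
	if l.tokens == 0 {
		fmt.Printf("%s: ErrImagePull: pull QPS exceeded\n", image)
		return
	}
	l.tokens--
	fmt.Printf("%s: pull started\n", image)
}

func main() {
	l := &pullLimiter{tokens: 3} // small budget: only the first three pulls succeed
	for _, img := range []string{
		"test-operator", "swift-operator", "telemetry-operator",
		"watcher-operator", "ovn-operator", "rabbitmq-cluster-operator",
	} {
		l.tryPull(img)
	}
	// Tokens refill at the configured QPS over time (not modeled here); the
	// rejected pods retry later, which shows up as ImagePullBackOff.
}
```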
pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d"] Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.118380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.118646 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.118707 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert podName:76c23b91-6df4-41e0-bcd3-eacc7e879aeb nodeName:}" failed. No retries permitted until 2025-11-29 01:29:51.118692025 +0000 UTC m=+1134.290841882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" (UID: "76c23b91-6df4-41e0-bcd3-eacc7e879aeb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.322103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.322672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.322285 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.322796 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:51.322769885 +0000 UTC m=+1134.494919742 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "webhook-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.322841 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.322924 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:51.322902948 +0000 UTC m=+1134.495052795 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "metrics-server-cert" not found Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.502281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" event={"ID":"08e10646-6c79-42a1-8180-b2f7595e73ce","Type":"ContainerStarted","Data":"9381a105c189aa5f044c4cc61a11238d4aa5bac9ac1c6f2ed455fa7a5214978e"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.504437 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" event={"ID":"a493c4bc-b7d4-4e55-bc8f-205242be99eb","Type":"ContainerStarted","Data":"d18b1d651073e13f7ddebcec38dc1c03300698d6b42c3dc9584a9ad72b56be16"} Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.507812 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" podUID="a493c4bc-b7d4-4e55-bc8f-205242be99eb" Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.511498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" event={"ID":"b08fc4d5-cf16-49c3-b95d-e9175ab67846","Type":"ContainerStarted","Data":"1e0535d2ec439662707b765f55428c3f105415bef680beaf19cdf2a32c06f4b3"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.515175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" event={"ID":"913736f1-2790-4ac3-a478-58de73caee8f","Type":"ContainerStarted","Data":"7e900b776b88217bc58a316acf61d940d5aaf04bae3ae979a226743d919810de"} Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.515931 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" podUID="b08fc4d5-cf16-49c3-b95d-e9175ab67846" Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.540375 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" podUID="913736f1-2790-4ac3-a478-58de73caee8f" Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.541624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" event={"ID":"9ed58501-79d8-4626-bd9f-dae8a95c872c","Type":"ContainerStarted","Data":"61b35c1fef97bfc7ef03044bd8c42c5cf2c377daed8a103ee2ff582dedc387b3"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.565103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" event={"ID":"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f","Type":"ContainerStarted","Data":"d2e1db2a6f05ee7770b088884523b14ba3386e9c010050c5cebab7ab1830934f"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.570257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" event={"ID":"2ad01dbb-582f-4074-a985-76067fc2bed3","Type":"ContainerStarted","Data":"f965894cc49c6fdae159102292baad5bb7b6093918fc54c7907ab1e9a9086002"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.582105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" event={"ID":"0ba380f8-eaae-4987-add4-bdd6aa96f090","Type":"ContainerStarted","Data":"ea04d6da35226e907b67a025a5b300f2c3da08388852996f5f0ea1059f458eeb"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.589113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" event={"ID":"d0754b9d-ce96-4174-83f2-c4436e7d8195","Type":"ContainerStarted","Data":"5fbcf90f241779150e713580d9c685453ae29ba75f4debd412d692f00bb4d233"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.603845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" event={"ID":"ee9e0c71-281c-41b2-b566-c0222b456f23","Type":"ContainerStarted","Data":"5d7bf0a48c5b586f4b6c5fa09ad1312ddf7b87ae6f1c4a8dd8c3d3f2d51b2ce0"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.606581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" event={"ID":"8609ba03-25af-49c9-b521-8c637dab5e91","Type":"ContainerStarted","Data":"734a63b01d4358a8353b18ff51580087a4b67187ea183a8ebec4bb66ae539a37"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.610777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" 
event={"ID":"ae863e3f-87c3-4712-9e4d-5fcfa63df10b","Type":"ContainerStarted","Data":"8c9efd8ba1ac2b63a6b955e20654362b40d95d6975feb28d0b2dfee48c3b2e51"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.615513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" event={"ID":"88cd9373-83ec-44e6-b108-04d0b853b5da","Type":"ContainerStarted","Data":"646fdf3fb8367089f37bdd4171342fd1f3fc7aaf364bce526e3e5cb95e8f191a"} Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.620922 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" podUID="88cd9373-83ec-44e6-b108-04d0b853b5da" Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.622304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" event={"ID":"b2daf909-0247-4a43-a96a-a136e5268260","Type":"ContainerStarted","Data":"e88929d593f3da6aa63c72fcc44c5827e4a9634abf6ac193c76a9c393a001efe"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.625442 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" event={"ID":"14d6b00a-a750-4bc4-9d78-12dcefeafe6b","Type":"ContainerStarted","Data":"c29a75611139582d578226c24a76f7b445027c480996aa90df18714f07a3b218"} Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.629673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" event={"ID":"beb586a3-ac88-42b7-b080-8b68cb73bf53","Type":"ContainerStarted","Data":"f122e0779078f7c45a8735bd005ca7ae0c414eeaa043afba338d6c11d7b2102c"} Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.638631 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" podUID="beb586a3-ac88-42b7-b080-8b68cb73bf53" Nov 29 01:29:49 crc kubenswrapper[4749]: I1129 01:29:49.642057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" event={"ID":"872c8278-f904-4ef0-8180-46fd4beea0dd","Type":"ContainerStarted","Data":"dc45771327bd07e7fa413a15977d4b7530601c4e957b22b05d9c0374f9d6f38b"} Nov 29 01:29:49 crc kubenswrapper[4749]: E1129 01:29:49.646471 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podUID="872c8278-f904-4ef0-8180-46fd4beea0dd" Nov 29 01:29:50 crc kubenswrapper[4749]: I1129 01:29:50.550988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.551258 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.551337 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:29:54.551314726 +0000 UTC m=+1137.723464583 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.665108 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" podUID="beb586a3-ac88-42b7-b080-8b68cb73bf53" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.666239 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" podUID="88cd9373-83ec-44e6-b108-04d0b853b5da" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.666395 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" podUID="913736f1-2790-4ac3-a478-58de73caee8f" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.666476 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podUID="872c8278-f904-4ef0-8180-46fd4beea0dd" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.666565 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" podUID="a493c4bc-b7d4-4e55-bc8f-205242be99eb" Nov 29 01:29:50 crc kubenswrapper[4749]: E1129 01:29:50.666564 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" podUID="b08fc4d5-cf16-49c3-b95d-e9175ab67846" Nov 29 01:29:51 crc kubenswrapper[4749]: I1129 01:29:51.160179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.160493 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.160555 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert podName:76c23b91-6df4-41e0-bcd3-eacc7e879aeb nodeName:}" failed. No retries permitted until 2025-11-29 01:29:55.160536763 +0000 UTC m=+1138.332686620 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" (UID: "76c23b91-6df4-41e0-bcd3-eacc7e879aeb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:51 crc kubenswrapper[4749]: I1129 01:29:51.363862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:51 crc kubenswrapper[4749]: I1129 01:29:51.363922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.364029 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.364107 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:55.364086379 +0000 UTC m=+1138.536236236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "webhook-server-cert" not found Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.364209 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 01:29:51 crc kubenswrapper[4749]: E1129 01:29:51.364323 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:29:55.364296914 +0000 UTC m=+1138.536446771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "metrics-server-cert" not found Nov 29 01:29:54 crc kubenswrapper[4749]: I1129 01:29:54.622620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:29:54 crc kubenswrapper[4749]: E1129 01:29:54.623982 4749 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:54 crc kubenswrapper[4749]: E1129 01:29:54.624361 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:30:02.62433257 +0000 UTC m=+1145.796482447 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: I1129 01:29:55.234566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.234813 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.234955 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert podName:76c23b91-6df4-41e0-bcd3-eacc7e879aeb nodeName:}" failed. No retries permitted until 2025-11-29 01:30:03.234920939 +0000 UTC m=+1146.407070836 (durationBeforeRetry 8s). 
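
All of these "cert"/"webhook-certs"/"metrics-certs" mounts fail for the same underlying reason: the Secret each volume references does not exist yet (such certificates are typically created asynchronously, e.g. by cert-manager or the operator itself, after the Deployment). A small client-go sketch that checks whether the Secrets named in the errors above exist yet — the secret names come from the log; kubeconfig handling is assumed as before:

// Check for the Secrets the failing "cert" volumes reference.
// Illustrative only; names copied from the log records above.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
		"webhook-server-cert",
		"metrics-server-cert",
	} {
		_, err := cs.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err) // e.g. secrets "webhook-server-cert" not found
			continue
		}
		fmt.Printf("%s: present\n", name)
	}
}
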
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" (UID: "76c23b91-6df4-41e0-bcd3-eacc7e879aeb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: I1129 01:29:55.374797 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:29:55 crc kubenswrapper[4749]: I1129 01:29:55.374882 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:29:55 crc kubenswrapper[4749]: I1129 01:29:55.437416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:55 crc kubenswrapper[4749]: I1129 01:29:55.437475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.437624 4749 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.437688 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:30:03.437668627 +0000 UTC m=+1146.609818484 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "metrics-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.438339 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 29 01:29:55 crc kubenswrapper[4749]: E1129 01:29:55.438492 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs podName:be884b00-8556-44c8-83e8-c851267b63e2 nodeName:}" failed. No retries permitted until 2025-11-29 01:30:03.438455986 +0000 UTC m=+1146.610605873 (durationBeforeRetry 8s). 
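
The machine-config-daemon liveness failure above is the kubelet's HTTP prober getting "connection refused" on the pod's health endpoint. For an HTTP probe the kubelet issues a GET and treats a transport error or a status outside 200–399 as failure. An illustrative re-creation of exactly this check — URL from the log, timeout assumed; this is not the kubelet's prober code:

// Re-create the failing liveness check from the log record above.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second} // assumed timeout
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// Matches the log: dial tcp 127.0.0.1:8798: connect: connection refused
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		fmt.Println("probe failed: status", resp.StatusCode)
		return
	}
	fmt.Println("probe succeeded:", resp.Status)
}
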
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs") pod "openstack-operator-controller-manager-6d7d7b9964-wws7p" (UID: "be884b00-8556-44c8-83e8-c851267b63e2") : secret "webhook-server-cert" not found Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.149224 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh"] Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.151307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.154462 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.154663 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.158708 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh"] Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.235713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfh8w\" (UniqueName: \"kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.236293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.236336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.336879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.336958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.336998 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfh8w\" (UniqueName: \"kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.339238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.352028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.357410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfh8w\" (UniqueName: \"kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w\") pod \"collect-profiles-29406330-bfhgh\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:00 crc kubenswrapper[4749]: I1129 01:30:00.507742 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.114574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh"] Nov 29 01:30:01 crc kubenswrapper[4749]: W1129 01:30:01.146830 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3ad010e_7987_4781_8c42_3dbbb4006be8.slice/crio-82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936 WatchSource:0}: Error finding container 82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936: Status 404 returned error can't find the container with id 82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936 Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.756498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" event={"ID":"9ed58501-79d8-4626-bd9f-dae8a95c872c","Type":"ContainerStarted","Data":"56eb59df6880b18763f02d344825be38a48f58a41c7fb64cd6a0490cc324c055"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.764983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" event={"ID":"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f","Type":"ContainerStarted","Data":"e0baa273d37d5ccc58b2115bce6816d9dd8b77cd293280ca3a77dc8dd7798506"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.769781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" 
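
The "kube-api-access-mfh8w" volume mounted above is the projected service-account volume every pod gets; once MountVolume.SetUp succeeds, the workload sees the token, CA bundle, and namespace at a fixed path. A minimal in-pod sketch of consuming it (the path is the standard mount point; running this only makes sense inside a pod):

// Read the service-account credentials surfaced by a kube-api-access-* volume.
package main

import (
	"fmt"
	"os"
)

func main() {
	const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, f := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(dir + "/" + f)
		if err != nil {
			fmt.Println(f, "not available:", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", f, len(b))
	}
}
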
event={"ID":"9e0d34d5-9c78-4c5b-8081-e076cde59208","Type":"ContainerStarted","Data":"883c66e9f12402a4a0140e336996fec898e8de5b1e6def42378c296633fc3e9b"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.776217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" event={"ID":"0ba380f8-eaae-4987-add4-bdd6aa96f090","Type":"ContainerStarted","Data":"b8c56ff4ccc37c96e459b6f2d90facca148e268dddafbde4af1b1ba0143e4c0a"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.785843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" event={"ID":"d0754b9d-ce96-4174-83f2-c4436e7d8195","Type":"ContainerStarted","Data":"757963def8f6f0139aeb2d3c880a1062e75582fb0f734f6506fc007a3739adee"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.792088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" event={"ID":"ee9e0c71-281c-41b2-b566-c0222b456f23","Type":"ContainerStarted","Data":"1b9964cb3d91ee679f0d2a24f0c3accd0f264f91f582631aad6e44ecd5b28134"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.803767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" event={"ID":"2ad01dbb-582f-4074-a985-76067fc2bed3","Type":"ContainerStarted","Data":"4088879c1b3174f0c7673d56ea4879b67ead5ca87aee30168b3fc34724294ca2"} Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.813876 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvchq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-dm22j_openstack-operators(14d6b00a-a750-4bc4-9d78-12dcefeafe6b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.814493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" 
event={"ID":"a3ad010e-7987-4781-8c42-3dbbb4006be8","Type":"ContainerStarted","Data":"82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936"} Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.815466 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" podUID="14d6b00a-a750-4bc4-9d78-12dcefeafe6b" Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.839253 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkpgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-s5vxs_openstack-operators(ae863e3f-87c3-4712-9e4d-5fcfa63df10b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.842309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" podUID="ae863e3f-87c3-4712-9e4d-5fcfa63df10b" Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.844044 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpkhp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-546d4bdf48-5c65k_openstack-operators(b2daf909-0247-4a43-a96a-a136e5268260): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.844665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" event={"ID":"3f250151-87d8-495b-895a-c43205c7b8ce","Type":"ContainerStarted","Data":"84abf5fdba206cacbfd18d5d582f81143d5f93e1270fb44d181aad3718ee8ab9"} Nov 29 01:30:01 crc kubenswrapper[4749]: E1129 01:30:01.846344 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" podUID="b2daf909-0247-4a43-a96a-a136e5268260" Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.862910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" event={"ID":"8609ba03-25af-49c9-b521-8c637dab5e91","Type":"ContainerStarted","Data":"2486f2627af11bd6a3bf5b1012e9bb770d05399b8472955e778dfe2010cdd6a0"} Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.863669 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" podStartSLOduration=1.8636130450000001 podStartE2EDuration="1.863613045s" podCreationTimestamp="2025-11-29 01:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:30:01.862548779 +0000 UTC m=+1145.034698636" watchObservedRunningTime="2025-11-29 01:30:01.863613045 +0000 UTC m=+1145.035762902" Nov 29 01:30:01 crc kubenswrapper[4749]: I1129 01:30:01.881717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" event={"ID":"08e10646-6c79-42a1-8180-b2f7595e73ce","Type":"ContainerStarted","Data":"e1a7948218188202ba9fff3aaddf6403e153d484d6af3987920946932d13306a"} Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.691300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:02 crc kubenswrapper[4749]: E1129 01:30:02.692389 4749 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 29 01:30:02 crc kubenswrapper[4749]: E1129 01:30:02.692530 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert podName:c707c92a-5aaa-40ca-a7ae-5ee5db538c3c nodeName:}" failed. No retries permitted until 2025-11-29 01:30:18.692508274 +0000 UTC m=+1161.864658131 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert") pod "infra-operator-controller-manager-57548d458d-l9m8x" (UID: "c707c92a-5aaa-40ca-a7ae-5ee5db538c3c") : secret "infra-operator-webhook-server-cert" not found Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.891810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" event={"ID":"14d6b00a-a750-4bc4-9d78-12dcefeafe6b","Type":"ContainerStarted","Data":"3698efc411747a3b2910a4f74cac0182093214c2ae73ed95cd62ee2e648d589b"} Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.892077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:30:02 crc kubenswrapper[4749]: E1129 01:30:02.896402 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" podUID="14d6b00a-a750-4bc4-9d78-12dcefeafe6b" Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.909825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" event={"ID":"ae863e3f-87c3-4712-9e4d-5fcfa63df10b","Type":"ContainerStarted","Data":"fd924834da4387cb90822634e809418d1f98a54b5e87172c4feefac289e1c00b"} Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.909939 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:30:02 crc kubenswrapper[4749]: E1129 01:30:02.916533 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" podUID="ae863e3f-87c3-4712-9e4d-5fcfa63df10b" Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.926261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" event={"ID":"b2daf909-0247-4a43-a96a-a136e5268260","Type":"ContainerStarted","Data":"120780b0dcc02dc118ad897babd0ec89b45fb7a47122a09e9161465f09db3dbc"} Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.926428 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:30:02 crc kubenswrapper[4749]: E1129 01:30:02.927862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" podUID="b2daf909-0247-4a43-a96a-a136e5268260" Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.946470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" event={"ID":"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040","Type":"ContainerStarted","Data":"f74096c3c989e2a7f247206db4da3b5b51d8886fca2242c09cb5aa196ba72217"} Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.970602 4749 generic.go:334] "Generic (PLEG): container finished" podID="a3ad010e-7987-4781-8c42-3dbbb4006be8" containerID="0a06a14a4ee2ac82fb078478a0a978c83014068cbf32277336124049f0390eb6" exitCode=0 Nov 29 01:30:02 crc kubenswrapper[4749]: I1129 01:30:02.970665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" event={"ID":"a3ad010e-7987-4781-8c42-3dbbb4006be8","Type":"ContainerDied","Data":"0a06a14a4ee2ac82fb078478a0a978c83014068cbf32277336124049f0390eb6"} Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.314799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.321375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/76c23b91-6df4-41e0-bcd3-eacc7e879aeb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m\" (UID: \"76c23b91-6df4-41e0-bcd3-eacc7e879aeb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.428852 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-s2nx6" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.436623 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.518337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.518396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.523570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-webhook-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.523876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/be884b00-8556-44c8-83e8-c851267b63e2-metrics-certs\") pod \"openstack-operator-controller-manager-6d7d7b9964-wws7p\" (UID: \"be884b00-8556-44c8-83e8-c851267b63e2\") " pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.601841 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-dtvvd" Nov 29 01:30:03 crc kubenswrapper[4749]: I1129 01:30:03.609673 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:03 crc kubenswrapper[4749]: E1129 01:30:03.984245 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" podUID="b2daf909-0247-4a43-a96a-a136e5268260" Nov 29 01:30:03 crc kubenswrapper[4749]: E1129 01:30:03.985677 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" podUID="ae863e3f-87c3-4712-9e4d-5fcfa63df10b" Nov 29 01:30:03 crc kubenswrapper[4749]: E1129 01:30:03.988552 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" podUID="14d6b00a-a750-4bc4-9d78-12dcefeafe6b" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.124235 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.286352 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume\") pod \"a3ad010e-7987-4781-8c42-3dbbb4006be8\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.286546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfh8w\" (UniqueName: \"kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w\") pod \"a3ad010e-7987-4781-8c42-3dbbb4006be8\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.286584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume\") pod \"a3ad010e-7987-4781-8c42-3dbbb4006be8\" (UID: \"a3ad010e-7987-4781-8c42-3dbbb4006be8\") " Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.287359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume" (OuterVolumeSpecName: "config-volume") pod "a3ad010e-7987-4781-8c42-3dbbb4006be8" (UID: "a3ad010e-7987-4781-8c42-3dbbb4006be8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.287583 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a3ad010e-7987-4781-8c42-3dbbb4006be8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.307418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a3ad010e-7987-4781-8c42-3dbbb4006be8" (UID: "a3ad010e-7987-4781-8c42-3dbbb4006be8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.311840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w" (OuterVolumeSpecName: "kube-api-access-mfh8w") pod "a3ad010e-7987-4781-8c42-3dbbb4006be8" (UID: "a3ad010e-7987-4781-8c42-3dbbb4006be8"). InnerVolumeSpecName "kube-api-access-mfh8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.389764 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfh8w\" (UniqueName: \"kubernetes.io/projected/a3ad010e-7987-4781-8c42-3dbbb4006be8-kube-api-access-mfh8w\") on node \"crc\" DevicePath \"\"" Nov 29 01:30:05 crc kubenswrapper[4749]: I1129 01:30:05.389806 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a3ad010e-7987-4781-8c42-3dbbb4006be8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:30:06 crc kubenswrapper[4749]: I1129 01:30:06.001985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" event={"ID":"a3ad010e-7987-4781-8c42-3dbbb4006be8","Type":"ContainerDied","Data":"82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936"} Nov 29 01:30:06 crc kubenswrapper[4749]: I1129 01:30:06.002044 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82816c0672bc529eb514223e90ef9cd5fc1617a6c73030169d76fb53ba0ea936" Nov 29 01:30:06 crc kubenswrapper[4749]: I1129 01:30:06.002060 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh" Nov 29 01:30:07 crc kubenswrapper[4749]: I1129 01:30:07.168889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" Nov 29 01:30:07 crc kubenswrapper[4749]: E1129 01:30:07.172100 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" podUID="ae863e3f-87c3-4712-9e4d-5fcfa63df10b" Nov 29 01:30:07 crc kubenswrapper[4749]: I1129 01:30:07.301750 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" Nov 29 01:30:07 crc kubenswrapper[4749]: E1129 01:30:07.305242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" podUID="b2daf909-0247-4a43-a96a-a136e5268260" Nov 29 01:30:07 crc kubenswrapper[4749]: I1129 01:30:07.773021 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" Nov 29 01:30:07 crc kubenswrapper[4749]: E1129 01:30:07.775234 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" podUID="14d6b00a-a750-4bc4-9d78-12dcefeafe6b" Nov 29 01:30:09 crc kubenswrapper[4749]: I1129 01:30:09.510966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m"] Nov 29 01:30:09 crc kubenswrapper[4749]: I1129 01:30:09.659682 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p"] Nov 29 01:30:10 crc kubenswrapper[4749]: W1129 01:30:10.038150 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c23b91_6df4_41e0_bcd3_eacc7e879aeb.slice/crio-12e6c44fea4008f22d182ff92be25e8160691413699aff47694a2a17ad7d98d8 WatchSource:0}: Error finding container 12e6c44fea4008f22d182ff92be25e8160691413699aff47694a2a17ad7d98d8: Status 404 returned error can't find the container with id 12e6c44fea4008f22d182ff92be25e8160691413699aff47694a2a17ad7d98d8 Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.043134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" event={"ID":"beb586a3-ac88-42b7-b080-8b68cb73bf53","Type":"ContainerStarted","Data":"ccd0f882742c2fe86b79ebfd8c9cef85f3873e7a80b4206aca62afb9216b2d33"} Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.044870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" 
event={"ID":"be884b00-8556-44c8-83e8-c851267b63e2","Type":"ContainerStarted","Data":"081ce9548b3ccdf9dfd8508487fb9b64be3863e6e1844ee82f34c26dc2d1df48"} Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.044931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" event={"ID":"be884b00-8556-44c8-83e8-c851267b63e2","Type":"ContainerStarted","Data":"356311f58bbf5391751416df5c7d3703996a12ba3ad6bfba22e52e3bfe50428c"} Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.045025 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.046112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" event={"ID":"88cd9373-83ec-44e6-b108-04d0b853b5da","Type":"ContainerStarted","Data":"4033182f3cfe7230be1833cb0f3de6cbff0407c85c9f84b918ce47246c9c6859"} Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.048630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" event={"ID":"76c23b91-6df4-41e0-bcd3-eacc7e879aeb","Type":"ContainerStarted","Data":"12e6c44fea4008f22d182ff92be25e8160691413699aff47694a2a17ad7d98d8"} Nov 29 01:30:11 crc kubenswrapper[4749]: I1129 01:30:11.076586 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" podStartSLOduration=24.076564765 podStartE2EDuration="24.076564765s" podCreationTimestamp="2025-11-29 01:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:30:11.072066014 +0000 UTC m=+1154.244215871" watchObservedRunningTime="2025-11-29 01:30:11.076564765 +0000 UTC m=+1154.248714622" Nov 29 01:30:12 crc kubenswrapper[4749]: I1129 01:30:12.059239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" event={"ID":"913736f1-2790-4ac3-a478-58de73caee8f","Type":"ContainerStarted","Data":"802eb267dc3f1979d6f9724b0f16c04dd4fbb4edf3aa6cd5576ac455c3591cb1"} Nov 29 01:30:12 crc kubenswrapper[4749]: I1129 01:30:12.064874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" event={"ID":"b08fc4d5-cf16-49c3-b95d-e9175ab67846","Type":"ContainerStarted","Data":"1e31ffa445c2e74bb5383eb1808797438002d7d7aba136fe6ca0559e0fadc817"} Nov 29 01:30:12 crc kubenswrapper[4749]: I1129 01:30:12.067029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" event={"ID":"a493c4bc-b7d4-4e55-bc8f-205242be99eb","Type":"ContainerStarted","Data":"3e169610bb314464307f925e1577ac8dfb9988494ecac7df7b2d7a559bcb3b69"} Nov 29 01:30:18 crc kubenswrapper[4749]: I1129 01:30:18.696455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:18 crc kubenswrapper[4749]: I1129 
01:30:18.707237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c707c92a-5aaa-40ca-a7ae-5ee5db538c3c-cert\") pod \"infra-operator-controller-manager-57548d458d-l9m8x\" (UID: \"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:18 crc kubenswrapper[4749]: I1129 01:30:18.738803 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5zftd" Nov 29 01:30:18 crc kubenswrapper[4749]: I1129 01:30:18.745058 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:21 crc kubenswrapper[4749]: E1129 01:30:21.124485 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Nov 29 01:30:21 crc kubenswrapper[4749]: E1129 01:30:21.125191 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hs22h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-kcnjp_openstack-operators(872c8278-f904-4ef0-8180-46fd4beea0dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:30:21 crc kubenswrapper[4749]: E1129 01:30:21.126610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podUID="872c8278-f904-4ef0-8180-46fd4beea0dd" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.181218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" event={"ID":"d0754b9d-ce96-4174-83f2-c4436e7d8195","Type":"ContainerStarted","Data":"4ce63aea7a93d0e97b524a60815ad1814f6bf1b65ee67f5e78737ab0c5a139f1"} Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.182367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.185390 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.187841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" event={"ID":"3bbc7cc2-0efd-4d6f-b424-d1558ed9f040","Type":"ContainerStarted","Data":"b3ce6a48209a597309115937a48a96b416a762c1506ab3e79a10ff4492bb376b"} Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.188305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.190268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.216291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6546668bfd-f499g" podStartSLOduration=15.72983569 podStartE2EDuration="37.216165053s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.637153373 +0000 UTC m=+1131.809303230" lastFinishedPulling="2025-11-29 01:30:10.123482736 +0000 UTC m=+1153.295632593" observedRunningTime="2025-11-29 01:30:23.204650917 +0000 UTC m=+1166.376800784" watchObservedRunningTime="2025-11-29 01:30:23.216165053 +0000 UTC m=+1166.388314920" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.258176 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-x7lmd" podStartSLOduration=15.08757514 podStartE2EDuration="37.258141332s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:47.966224131 +0000 UTC m=+1131.138373988" lastFinishedPulling="2025-11-29 01:30:10.136790323 +0000 UTC m=+1153.308940180" observedRunningTime="2025-11-29 01:30:23.244296329 +0000 UTC m=+1166.416446196" watchObservedRunningTime="2025-11-29 01:30:23.258141332 +0000 UTC m=+1166.430291189" Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.346498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x"] Nov 29 01:30:23 crc kubenswrapper[4749]: I1129 01:30:23.620834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6d7d7b9964-wws7p" Nov 29 01:30:23 crc kubenswrapper[4749]: W1129 01:30:23.876108 4749 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc707c92a_5aaa_40ca_a7ae_5ee5db538c3c.slice/crio-b1ab6e08247fac973e3d08d76f5b34d722507c5caf31da55db317afd664903e3 WatchSource:0}: Error finding container b1ab6e08247fac973e3d08d76f5b34d722507c5caf31da55db317afd664903e3: Status 404 returned error can't find the container with id b1ab6e08247fac973e3d08d76f5b34d722507c5caf31da55db317afd664903e3 Nov 29 01:30:24 crc kubenswrapper[4749]: I1129 01:30:24.206182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" event={"ID":"ae863e3f-87c3-4712-9e4d-5fcfa63df10b","Type":"ContainerStarted","Data":"6f465dfd23bcde3bf5573a05d9dd8100e36cf5f658b99815c14a4b02d4cb40b2"} Nov 29 01:30:24 crc kubenswrapper[4749]: I1129 01:30:24.209466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" event={"ID":"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c","Type":"ContainerStarted","Data":"b1ab6e08247fac973e3d08d76f5b34d722507c5caf31da55db317afd664903e3"} Nov 29 01:30:24 crc kubenswrapper[4749]: I1129 01:30:24.230620 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-s5vxs" podStartSLOduration=26.034561981 podStartE2EDuration="38.230590717s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.633757519 +0000 UTC m=+1131.805907376" lastFinishedPulling="2025-11-29 01:30:00.829786255 +0000 UTC m=+1144.001936112" observedRunningTime="2025-11-29 01:30:24.224722212 +0000 UTC m=+1167.396872069" watchObservedRunningTime="2025-11-29 01:30:24.230590717 +0000 UTC m=+1167.402740574" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.224459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" event={"ID":"88cd9373-83ec-44e6-b108-04d0b853b5da","Type":"ContainerStarted","Data":"8dbdb88a767ccdb72a5cc6fea4a5cd8fb459beeb3218f206d252064d2ac787e8"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.225914 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.230234 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.231548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" event={"ID":"9ed58501-79d8-4626-bd9f-dae8a95c872c","Type":"ContainerStarted","Data":"35c1fc48d3792def3541356eb0ccb60d2b6bf8bfe4f39102c6e82a74fe9ae10a"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.231869 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.234367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.236379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" 
event={"ID":"2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f","Type":"ContainerStarted","Data":"e6dec6c90c0c9d316fa41a224eac5fa91e9fdf51ee9625cdc2eaa723d5a61e25"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.237095 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.238739 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.254505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" event={"ID":"0ba380f8-eaae-4987-add4-bdd6aa96f090","Type":"ContainerStarted","Data":"b2660367e135410a621b61c2386ff1978b9c92797a776282faadabda84a2dce3"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.255294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.258629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.259753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" event={"ID":"8609ba03-25af-49c9-b521-8c637dab5e91","Type":"ContainerStarted","Data":"8886bcaf186dd35f4c4328c68d56e30c16653139a9db8565f0f7bb7e6ba0debe"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.259784 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-6jvjq" podStartSLOduration=17.746837892 podStartE2EDuration="38.259757648s" podCreationTimestamp="2025-11-29 01:29:47 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.877877192 +0000 UTC m=+1132.050027049" lastFinishedPulling="2025-11-29 01:30:09.390796938 +0000 UTC m=+1152.562946805" observedRunningTime="2025-11-29 01:30:25.255097123 +0000 UTC m=+1168.427246990" watchObservedRunningTime="2025-11-29 01:30:25.259757648 +0000 UTC m=+1168.431907505" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.260351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.262443 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.263130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" event={"ID":"ee9e0c71-281c-41b2-b566-c0222b456f23","Type":"ContainerStarted","Data":"b82da57039e29c741904c7404ed1d4f19b63f755407f440df54f935d989d4cab"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.263391 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.266315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.269435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" event={"ID":"a493c4bc-b7d4-4e55-bc8f-205242be99eb","Type":"ContainerStarted","Data":"f25508410ae76a937dcdb2341414d4472cc0a6e0bf825af45bedfcc66685eb61"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.269874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.271843 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.286835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" event={"ID":"08e10646-6c79-42a1-8180-b2f7595e73ce","Type":"ContainerStarted","Data":"85965d5a332ee8fccc42a09e6d56c5a6272978bded17cadd9d6edc0b27e208fd"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.287863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.294025 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.312343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" event={"ID":"913736f1-2790-4ac3-a478-58de73caee8f","Type":"ContainerStarted","Data":"aa6fa24174327a8b626a78c3009037ab6587f8f5fac916ca3aac34b3c667bc8c"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.314541 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.315270 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.320583 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-l5s92" podStartSLOduration=17.817246354 podStartE2EDuration="39.320557524s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.636299692 +0000 UTC m=+1131.808449549" lastFinishedPulling="2025-11-29 01:30:10.139610862 +0000 UTC m=+1153.311760719" observedRunningTime="2025-11-29 01:30:25.314893014 +0000 UTC m=+1168.487042861" watchObservedRunningTime="2025-11-29 01:30:25.320557524 +0000 UTC m=+1168.492707381" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.338731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" event={"ID":"b08fc4d5-cf16-49c3-b95d-e9175ab67846","Type":"ContainerStarted","Data":"73ddbc47fd42b4b88c0af613b30dee159cd226597db1476cdbcbffaa144671bf"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.341309 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.341655 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-7hch7" podStartSLOduration=5.008617327 podStartE2EDuration="39.341626706s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.643946809 +0000 UTC m=+1131.816096666" lastFinishedPulling="2025-11-29 01:30:22.976956178 +0000 UTC m=+1166.149106045" observedRunningTime="2025-11-29 01:30:25.339176295 +0000 UTC m=+1168.511326152" watchObservedRunningTime="2025-11-29 01:30:25.341626706 +0000 UTC m=+1168.513776553" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.346224 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.369296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" event={"ID":"14d6b00a-a750-4bc4-9d78-12dcefeafe6b","Type":"ContainerStarted","Data":"25150115bff8ca1a778165c87948e42a71bf1421b7595d42ac7a8b065d7250e3"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.376305 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.376375 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.376427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.377188 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.377264 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8" gracePeriod=600 Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.383574 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-ww9sb" podStartSLOduration=17.804859503 podStartE2EDuration="39.383536374s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.638971907 +0000 UTC m=+1131.811121764" lastFinishedPulling="2025-11-29 01:30:10.217648778 +0000 UTC 
m=+1153.389798635" observedRunningTime="2025-11-29 01:30:25.374445899 +0000 UTC m=+1168.546595766" watchObservedRunningTime="2025-11-29 01:30:25.383536374 +0000 UTC m=+1168.555686231" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.384504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" event={"ID":"76c23b91-6df4-41e0-bcd3-eacc7e879aeb","Type":"ContainerStarted","Data":"7b0530049b0b2028dfc2d0f8d5a5e5d8f6ecf4ede0723ba4c29a41ef8ce93779"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.384563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" event={"ID":"76c23b91-6df4-41e0-bcd3-eacc7e879aeb","Type":"ContainerStarted","Data":"3e971ac719adcabf795d170fd3e5f4c99a08ca8b12d5a0a0cc829d6cce1d0800"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.385523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.391673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" event={"ID":"beb586a3-ac88-42b7-b080-8b68cb73bf53","Type":"ContainerStarted","Data":"a250fff517d5969b59e2f884d45adb702078a9f4a8298bfb298cdd143c4d99bd"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.392399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.404261 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2kq6d" podStartSLOduration=18.228312194 podStartE2EDuration="39.404234797s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.889345504 +0000 UTC m=+1132.061495361" lastFinishedPulling="2025-11-29 01:30:10.065268107 +0000 UTC m=+1153.237417964" observedRunningTime="2025-11-29 01:30:25.395427469 +0000 UTC m=+1168.567577326" watchObservedRunningTime="2025-11-29 01:30:25.404234797 +0000 UTC m=+1168.576384654" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.413395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" event={"ID":"b2daf909-0247-4a43-a96a-a136e5268260","Type":"ContainerStarted","Data":"124cfa6145f5c9ad505918b2430198084bd8ffe7b8b69bd1d7a97616c4f685a6"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.414577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.421367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" event={"ID":"2ad01dbb-582f-4074-a985-76067fc2bed3","Type":"ContainerStarted","Data":"17e6f265b300b866d0cac7f7b88987e926e88f6069d93102e17cbf30782236ff"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.421795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.423917 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.439720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" event={"ID":"9e0d34d5-9c78-4c5b-8081-e076cde59208","Type":"ContainerStarted","Data":"3dcae54186a203b4fd45b803deaf2eabbf1f4c5258b112ca786ad3d48e116652"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.439819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.444789 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-dm22j" podStartSLOduration=27.468930661999998 podStartE2EDuration="39.444762571s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.853772311 +0000 UTC m=+1132.025922168" lastFinishedPulling="2025-11-29 01:30:00.82960422 +0000 UTC m=+1144.001754077" observedRunningTime="2025-11-29 01:30:25.428923498 +0000 UTC m=+1168.601073365" watchObservedRunningTime="2025-11-29 01:30:25.444762571 +0000 UTC m=+1168.616912428" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.450952 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.456140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" event={"ID":"3f250151-87d8-495b-895a-c43205c7b8ce","Type":"ContainerStarted","Data":"78c8234f1f386288802a436e1e89676384eb3b3d1169f8285a8f35fe70abbd5d"} Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.456479 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-56rwq" podStartSLOduration=5.171013682 podStartE2EDuration="39.45645784s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.633691728 +0000 UTC m=+1131.805841585" lastFinishedPulling="2025-11-29 01:30:22.919135876 +0000 UTC m=+1166.091285743" observedRunningTime="2025-11-29 01:30:25.455926097 +0000 UTC m=+1168.628075944" watchObservedRunningTime="2025-11-29 01:30:25.45645784 +0000 UTC m=+1168.628607697" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.457147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.467095 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.531572 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-662nk" podStartSLOduration=5.252198452 podStartE2EDuration="39.53155066s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.636935987 +0000 UTC m=+1131.809085844" lastFinishedPulling="2025-11-29 01:30:22.916288185 +0000 UTC m=+1166.088438052" observedRunningTime="2025-11-29 01:30:25.487621902 
+0000 UTC m=+1168.659771769" watchObservedRunningTime="2025-11-29 01:30:25.53155066 +0000 UTC m=+1168.703700517" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.537120 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fg7sg" podStartSLOduration=18.347184661 podStartE2EDuration="39.537110648s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.87532709 +0000 UTC m=+1132.047476947" lastFinishedPulling="2025-11-29 01:30:10.065253077 +0000 UTC m=+1153.237402934" observedRunningTime="2025-11-29 01:30:25.527440869 +0000 UTC m=+1168.699590726" watchObservedRunningTime="2025-11-29 01:30:25.537110648 +0000 UTC m=+1168.709260505" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.573265 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-6xbh8" podStartSLOduration=16.482599413 podStartE2EDuration="39.573231503s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.646451161 +0000 UTC m=+1131.818601008" lastFinishedPulling="2025-11-29 01:30:11.737083231 +0000 UTC m=+1154.909233098" observedRunningTime="2025-11-29 01:30:25.562585379 +0000 UTC m=+1168.734735246" watchObservedRunningTime="2025-11-29 01:30:25.573231503 +0000 UTC m=+1168.745381350" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.598129 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-ssmp5" podStartSLOduration=17.387150146 podStartE2EDuration="38.598101819s" podCreationTimestamp="2025-11-29 01:29:47 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.854332084 +0000 UTC m=+1132.026481931" lastFinishedPulling="2025-11-29 01:30:10.065283737 +0000 UTC m=+1153.237433604" observedRunningTime="2025-11-29 01:30:25.597378151 +0000 UTC m=+1168.769528008" watchObservedRunningTime="2025-11-29 01:30:25.598101819 +0000 UTC m=+1168.770251666" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.668509 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-ldkwb" podStartSLOduration=19.148979435 podStartE2EDuration="39.668483762s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.871297931 +0000 UTC m=+1132.043447788" lastFinishedPulling="2025-11-29 01:30:09.390802258 +0000 UTC m=+1152.562952115" observedRunningTime="2025-11-29 01:30:25.665135509 +0000 UTC m=+1168.837285386" watchObservedRunningTime="2025-11-29 01:30:25.668483762 +0000 UTC m=+1168.840633619" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.698518 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-546d4bdf48-5c65k" podStartSLOduration=27.445899481 podStartE2EDuration="39.698491035s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.636713252 +0000 UTC m=+1131.808863109" lastFinishedPulling="2025-11-29 01:30:00.889304806 +0000 UTC m=+1144.061454663" observedRunningTime="2025-11-29 01:30:25.695027199 +0000 UTC m=+1168.867177056" watchObservedRunningTime="2025-11-29 01:30:25.698491035 +0000 UTC m=+1168.870640892" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.740352 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-8lkfs" podStartSLOduration=5.584557508 podStartE2EDuration="39.740301911s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.801825155 +0000 UTC m=+1131.973975012" lastFinishedPulling="2025-11-29 01:30:22.957569548 +0000 UTC m=+1166.129719415" observedRunningTime="2025-11-29 01:30:25.722640923 +0000 UTC m=+1168.894790800" watchObservedRunningTime="2025-11-29 01:30:25.740301911 +0000 UTC m=+1168.912451768" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.770535 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-668d9c48b9-wm9q9" podStartSLOduration=7.663528216 podStartE2EDuration="39.770469058s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.251040713 +0000 UTC m=+1131.423190570" lastFinishedPulling="2025-11-29 01:30:20.357981515 +0000 UTC m=+1163.530131412" observedRunningTime="2025-11-29 01:30:25.763950397 +0000 UTC m=+1168.936100254" watchObservedRunningTime="2025-11-29 01:30:25.770469058 +0000 UTC m=+1168.942618915" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.840147 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" podStartSLOduration=25.874072139 podStartE2EDuration="39.840115613s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:30:10.064123159 +0000 UTC m=+1153.236273016" lastFinishedPulling="2025-11-29 01:30:24.030166633 +0000 UTC m=+1167.202316490" observedRunningTime="2025-11-29 01:30:25.8270575 +0000 UTC m=+1168.999207347" watchObservedRunningTime="2025-11-29 01:30:25.840115613 +0000 UTC m=+1169.012265470" Nov 29 01:30:25 crc kubenswrapper[4749]: I1129 01:30:25.859099 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-5jr7h" podStartSLOduration=17.949119788 podStartE2EDuration="39.859059962s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.229411692 +0000 UTC m=+1131.401561549" lastFinishedPulling="2025-11-29 01:30:10.139351876 +0000 UTC m=+1153.311501723" observedRunningTime="2025-11-29 01:30:25.847054165 +0000 UTC m=+1169.019204032" watchObservedRunningTime="2025-11-29 01:30:25.859059962 +0000 UTC m=+1169.031209819" Nov 29 01:30:26 crc kubenswrapper[4749]: I1129 01:30:26.464616 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8" exitCode=0 Nov 29 01:30:26 crc kubenswrapper[4749]: I1129 01:30:26.466331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8"} Nov 29 01:30:26 crc kubenswrapper[4749]: I1129 01:30:26.466377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97"} Nov 29 01:30:26 crc kubenswrapper[4749]: I1129 01:30:26.466396 4749 scope.go:117] "RemoveContainer" 
containerID="9302b61a72148487837a4aeb2ccc5c42240573bc5890594b41af31b7f42617b2" Nov 29 01:30:27 crc kubenswrapper[4749]: I1129 01:30:27.476492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" event={"ID":"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c","Type":"ContainerStarted","Data":"fb01309d9f419149269b93c541ed5ed25c94a8f148851fd5cdb241a51d4557e8"} Nov 29 01:30:27 crc kubenswrapper[4749]: I1129 01:30:27.477079 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:27 crc kubenswrapper[4749]: I1129 01:30:27.477099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" event={"ID":"c707c92a-5aaa-40ca-a7ae-5ee5db538c3c","Type":"ContainerStarted","Data":"ecfffcb70fbc3117297b9a907ed0b26f1a6c18b9ae7bdf1cc1081491563b2c7e"} Nov 29 01:30:27 crc kubenswrapper[4749]: I1129 01:30:27.504903 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" podStartSLOduration=38.411053406 podStartE2EDuration="41.504870436s" podCreationTimestamp="2025-11-29 01:29:46 +0000 UTC" firstStartedPulling="2025-11-29 01:30:23.911589906 +0000 UTC m=+1167.083739783" lastFinishedPulling="2025-11-29 01:30:27.005406956 +0000 UTC m=+1170.177556813" observedRunningTime="2025-11-29 01:30:27.495981515 +0000 UTC m=+1170.668131392" watchObservedRunningTime="2025-11-29 01:30:27.504870436 +0000 UTC m=+1170.677020303" Nov 29 01:30:32 crc kubenswrapper[4749]: E1129 01:30:32.079922 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podUID="872c8278-f904-4ef0-8180-46fd4beea0dd" Nov 29 01:30:33 crc kubenswrapper[4749]: I1129 01:30:33.452321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m" Nov 29 01:30:38 crc kubenswrapper[4749]: I1129 01:30:38.754645 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-l9m8x" Nov 29 01:30:46 crc kubenswrapper[4749]: I1129 01:30:46.732920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" event={"ID":"872c8278-f904-4ef0-8180-46fd4beea0dd","Type":"ContainerStarted","Data":"9716389c219ca267ab6dbfae2b170c8fd566dad9a8d26391c541a56ddcd6afb3"} Nov 29 01:30:46 crc kubenswrapper[4749]: I1129 01:30:46.767763 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-kcnjp" podStartSLOduration=3.133734799 podStartE2EDuration="59.767550918s" podCreationTimestamp="2025-11-29 01:29:47 +0000 UTC" firstStartedPulling="2025-11-29 01:29:48.873181827 +0000 UTC m=+1132.045331684" lastFinishedPulling="2025-11-29 01:30:45.506997916 +0000 UTC m=+1188.679147803" observedRunningTime="2025-11-29 01:30:46.757845787 +0000 UTC m=+1189.929995704" watchObservedRunningTime="2025-11-29 01:30:46.767550918 +0000 UTC 
m=+1189.939700815" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.466981 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:02 crc kubenswrapper[4749]: E1129 01:31:02.471059 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3ad010e-7987-4781-8c42-3dbbb4006be8" containerName="collect-profiles" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.471108 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3ad010e-7987-4781-8c42-3dbbb4006be8" containerName="collect-profiles" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.471349 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3ad010e-7987-4781-8c42-3dbbb4006be8" containerName="collect-profiles" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.472525 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.481287 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.481282 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.481785 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.482696 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h57g6" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.483434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.542841 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.544558 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.547626 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.570402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.643133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4w29\" (UniqueName: \"kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.643215 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.643290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.643313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9q2\" (UniqueName: \"kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.643335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.744997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4w29\" (UniqueName: \"kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.745517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.745725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 
01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.745973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9q2\" (UniqueName: \"kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.746470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.746786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.747142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.747591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.769966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9q2\" (UniqueName: \"kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2\") pod \"dnsmasq-dns-675f4bcbfc-mvmwb\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.772334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4w29\" (UniqueName: \"kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29\") pod \"dnsmasq-dns-78dd6ddcc-vvpnp\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.803604 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:02 crc kubenswrapper[4749]: I1129 01:31:02.862855 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:03 crc kubenswrapper[4749]: I1129 01:31:03.145529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:03 crc kubenswrapper[4749]: I1129 01:31:03.285676 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:03 crc kubenswrapper[4749]: W1129 01:31:03.288880 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fcde7d2_d6e7_4386_89a6_0d0516cfa4b4.slice/crio-5ca1369df73db20830620c6e87b8801863995df80980e1f818f72de3a2f50e4b WatchSource:0}: Error finding container 5ca1369df73db20830620c6e87b8801863995df80980e1f818f72de3a2f50e4b: Status 404 returned error can't find the container with id 5ca1369df73db20830620c6e87b8801863995df80980e1f818f72de3a2f50e4b Nov 29 01:31:03 crc kubenswrapper[4749]: I1129 01:31:03.942299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" event={"ID":"91a21aa9-44e7-4c6b-b520-8ce97c04f437","Type":"ContainerStarted","Data":"63fff0f8a6437dd92ade54640617bcd3e018fb6aaa0cbb39d6e207e6248e4697"} Nov 29 01:31:03 crc kubenswrapper[4749]: I1129 01:31:03.944669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" event={"ID":"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4","Type":"ContainerStarted","Data":"5ca1369df73db20830620c6e87b8801863995df80980e1f818f72de3a2f50e4b"} Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.749573 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.777493 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"] Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.779243 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.798769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"] Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.893173 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwmz\" (UniqueName: \"kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.893247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.893379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.994836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwmz\" (UniqueName: \"kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.994885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.994974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.997617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:04 crc kubenswrapper[4749]: I1129 01:31:04.998467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.034551 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.035172 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwmz\" (UniqueName: \"kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz\") pod \"dnsmasq-dns-666b6646f7-gh8rv\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.074282 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"] Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.081560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.103141 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.106590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"] Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.212152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.212260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.212668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgp8b\" (UniqueName: \"kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.314554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgp8b\" (UniqueName: \"kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.314642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.314669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.315984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.316358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.369483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgp8b\" (UniqueName: \"kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b\") pod \"dnsmasq-dns-57d769cc4f-z8x8p\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.418718 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.921801 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.923615 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.926960 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.927028 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.927159 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.927224 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.926961 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q6m5t" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.928384 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.930466 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.937010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.988931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"] Nov 29 01:31:05 crc kubenswrapper[4749]: W1129 01:31:05.989521 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb31b93_8853_4eb4_a2b5_e8dec5c1c55e.slice/crio-641d3051f50184ccac9891693914441b4bdc85d1bb17e5b9a316309b0f59df84 WatchSource:0}: Error finding container 641d3051f50184ccac9891693914441b4bdc85d1bb17e5b9a316309b0f59df84: Status 404 returned error can't find the container with id 641d3051f50184ccac9891693914441b4bdc85d1bb17e5b9a316309b0f59df84 Nov 29 01:31:05 crc kubenswrapper[4749]: W1129 
01:31:05.992902 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d6a2b30_fdb2_4552_b574_0843137e497b.slice/crio-54bb9950ec5f4b505c25bc2a77f36f8daccdf282fcbd17d632c3af4fc0b6d357 WatchSource:0}: Error finding container 54bb9950ec5f4b505c25bc2a77f36f8daccdf282fcbd17d632c3af4fc0b6d357: Status 404 returned error can't find the container with id 54bb9950ec5f4b505c25bc2a77f36f8daccdf282fcbd17d632c3af4fc0b6d357 Nov 29 01:31:05 crc kubenswrapper[4749]: I1129 01:31:05.996793 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"] Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036158 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.036869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.037036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4sk4\" 
(UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.037089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.037186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.037222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.139344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4sk4\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.139427 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.139501 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.139521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.140482 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" 
Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.141910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.142008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.142494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.142524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.143076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.143179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.148471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.149094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.149150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.149165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.158968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4sk4\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.161091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.223050 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.230809 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.234658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.235154 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.235456 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.237162 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.237525 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.237738 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wt6c8" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.237847 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.238300 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.258611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345231 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345251 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345273 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzld\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.345803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzld\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.447986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.448007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.448521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.449798 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.449841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.450968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.451360 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.453269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.456947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.456945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.457050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.458677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.478440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzld\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.505380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.570042 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:31:06 crc kubenswrapper[4749]: I1129 01:31:06.826053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:31:06 crc kubenswrapper[4749]: W1129 01:31:06.846269 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31a44203_fd94_4eb4_952f_d54a5c577095.slice/crio-9f0a445b3e24931289c0920fe22beb7a96551b7c1eb02e57c99c4c323e31fa6c WatchSource:0}: Error finding container 9f0a445b3e24931289c0920fe22beb7a96551b7c1eb02e57c99c4c323e31fa6c: Status 404 returned error can't find the container with id 9f0a445b3e24931289c0920fe22beb7a96551b7c1eb02e57c99c4c323e31fa6c Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.024412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" event={"ID":"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e","Type":"ContainerStarted","Data":"641d3051f50184ccac9891693914441b4bdc85d1bb17e5b9a316309b0f59df84"} Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.038571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerStarted","Data":"9f0a445b3e24931289c0920fe22beb7a96551b7c1eb02e57c99c4c323e31fa6c"} Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.053692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" event={"ID":"3d6a2b30-fdb2-4552-b574-0843137e497b","Type":"ContainerStarted","Data":"54bb9950ec5f4b505c25bc2a77f36f8daccdf282fcbd17d632c3af4fc0b6d357"} Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.134439 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.552547 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.555085 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.558463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.563444 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.563952 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.564034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.563967 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-twcjs" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.567373 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689913 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.689993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bl48\" (UniqueName: \"kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.817973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bl48\" (UniqueName: \"kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.818418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.819409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.819450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.821243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.821479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.821636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.821967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.822104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.822191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.822542 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.822870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.823475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.841157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.842353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bl48\" (UniqueName: \"kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.849071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.874476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " pod="openstack/openstack-galera-0" Nov 29 01:31:07 crc kubenswrapper[4749]: I1129 01:31:07.880563 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.092667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerStarted","Data":"5767aeef5b80eb49702430521e1e47fa69ca9b4cbf2afc25a9131ede7d597978"} Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.403339 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:31:08 crc kubenswrapper[4749]: W1129 01:31:08.419459 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf05059ec_0cc5_4873_8041_bb14c2fa4c53.slice/crio-45788b64bbc41910f81bca78a3a12bf0287f755b49c0a7d3fdac15570d7d0c18 WatchSource:0}: Error finding container 45788b64bbc41910f81bca78a3a12bf0287f755b49c0a7d3fdac15570d7d0c18: Status 404 returned error can't find the container with id 45788b64bbc41910f81bca78a3a12bf0287f755b49c0a7d3fdac15570d7d0c18 Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.895549 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.915407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.915541 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.920506 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-s78sc" Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.921029 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.921120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 29 01:31:08 crc kubenswrapper[4749]: I1129 01:31:08.921398 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048610 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4jh\" (UniqueName: \"kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.048961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.117996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerStarted","Data":"45788b64bbc41910f81bca78a3a12bf0287f755b49c0a7d3fdac15570d7d0c18"} Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.157934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4jh\" (UniqueName: \"kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158170 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.158785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.160078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.160265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.160552 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.163336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.165960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.167176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.181996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4jh\" (UniqueName: \"kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.196073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.243966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.265808 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.269373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.277073 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.277223 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.277517 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6hlcq" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.281769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.368470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.368566 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.368640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhml\" (UniqueName: \"kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.368685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.368707 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.469690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhml\" (UniqueName: \"kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.469778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.469807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.469844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.469884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.471504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.472310 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.481865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.499516 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.499761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhml\" (UniqueName: \"kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml\") pod \"memcached-0\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " pod="openstack/memcached-0" Nov 29 01:31:09 crc kubenswrapper[4749]: I1129 01:31:09.604019 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.204174 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.222747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.222862 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.231518 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nxbtm" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.314535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phv28\" (UniqueName: \"kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28\") pod \"kube-state-metrics-0\" (UID: \"f479785d-0431-4aaf-88b6-ad9000996a52\") " pod="openstack/kube-state-metrics-0" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.419506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phv28\" (UniqueName: \"kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28\") pod \"kube-state-metrics-0\" (UID: \"f479785d-0431-4aaf-88b6-ad9000996a52\") " pod="openstack/kube-state-metrics-0" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.448936 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phv28\" (UniqueName: \"kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28\") pod \"kube-state-metrics-0\" (UID: \"f479785d-0431-4aaf-88b6-ad9000996a52\") " pod="openstack/kube-state-metrics-0" Nov 29 01:31:11 crc kubenswrapper[4749]: I1129 01:31:11.595791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.529737 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.531875 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.538152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zljnz" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.540787 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.543591 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.544896 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.545741 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.555724 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.573517 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc7dq\" (UniqueName: \"kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.692995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89sp\" (UniqueName: \"kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc 
kubenswrapper[4749]: I1129 01:31:14.693132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.693632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc7dq\" (UniqueName: \"kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795731 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w89sp\" (UniqueName: \"kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795810 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795894 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.795955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.797024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " 
pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.797231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.797096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.798017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.798102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.800384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.801439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.801571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.801605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.807438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.809940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle\") pod 
\"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.814187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc7dq\" (UniqueName: \"kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq\") pod \"ovn-controller-ovs-2k6f9\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.816161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89sp\" (UniqueName: \"kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp\") pod \"ovn-controller-9lxvg\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.870766 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:14 crc kubenswrapper[4749]: I1129 01:31:14.877177 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.190558 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.194465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.202303 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.221603 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.221920 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.226043 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-69z2s" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.226672 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.244946 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.329999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.330060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4q7f\" (UniqueName: \"kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.330143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.330174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.330211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.330265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.331416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.331514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 
01:31:15.433407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4q7f\" (UniqueName: \"kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.433529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.434012 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.434908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.435299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.435341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.441255 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.445015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.450646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.456307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4q7f\" (UniqueName: \"kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.457087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:15 crc kubenswrapper[4749]: I1129 01:31:15.585620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.812787 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.816115 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.819977 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.821181 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.821354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.823719 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-568gv" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.833238 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908516 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbvc\" (UniqueName: \"kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:18 crc kubenswrapper[4749]: I1129 01:31:18.908909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010527 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sxbvc\" (UniqueName: \"kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.010677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.011447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.011433 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.013525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.013523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.021554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.022639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 
01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.023466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.047116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbvc\" (UniqueName: \"kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.050859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:19 crc kubenswrapper[4749]: I1129 01:31:19.148117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.355267 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.357239 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4sk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(31a44203-fd94-4eb4-952f-d54a5c577095): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.357490 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.357637 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqzld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(9a9603fe-72d8-479a-86be-9b914455fba1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.358846 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" Nov 29 01:31:25 crc kubenswrapper[4749]: E1129 01:31:25.359669 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.071279 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.072004 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-km9q2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-mvmwb_openstack(3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.073398 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" podUID="3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.087829 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.088126 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mgp8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-z8x8p_openstack(7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.090282 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.325024 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.325342 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" Nov 29 01:31:26 crc kubenswrapper[4749]: E1129 01:31:26.325416 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.252301 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.253994 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4w29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vvpnp_openstack(91a21aa9-44e7-4c6b-b520-8ce97c04f437): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.255450 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.255485 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" podUID="91a21aa9-44e7-4c6b-b520-8ce97c04f437" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.259291 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug 
--bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wwmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-gh8rv_openstack(3d6a2b30-fdb2-4552-b574-0843137e497b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.260495 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.328552 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.352916 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.353112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mvmwb" event={"ID":"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4","Type":"ContainerDied","Data":"5ca1369df73db20830620c6e87b8801863995df80980e1f818f72de3a2f50e4b"} Nov 29 01:31:28 crc kubenswrapper[4749]: E1129 01:31:28.354642 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.415508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config\") pod \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.415649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9q2\" (UniqueName: \"kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2\") pod \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\" (UID: \"3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4\") " Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.416637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config" (OuterVolumeSpecName: "config") pod "3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4" (UID: "3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.431334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2" (OuterVolumeSpecName: "kube-api-access-km9q2") pod "3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4" (UID: "3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4"). InnerVolumeSpecName "kube-api-access-km9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.518181 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.518580 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9q2\" (UniqueName: \"kubernetes.io/projected/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4-kube-api-access-km9q2\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.715444 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.721786 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mvmwb"] Nov 29 01:31:28 crc kubenswrapper[4749]: I1129 01:31:28.947913 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.029091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4w29\" (UniqueName: \"kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29\") pod \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.029340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc\") pod \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.030183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91a21aa9-44e7-4c6b-b520-8ce97c04f437" (UID: "91a21aa9-44e7-4c6b-b520-8ce97c04f437"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.030299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config\") pod \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\" (UID: \"91a21aa9-44e7-4c6b-b520-8ce97c04f437\") " Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.030896 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.031541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config" (OuterVolumeSpecName: "config") pod "91a21aa9-44e7-4c6b-b520-8ce97c04f437" (UID: "91a21aa9-44e7-4c6b-b520-8ce97c04f437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.040164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29" (OuterVolumeSpecName: "kube-api-access-x4w29") pod "91a21aa9-44e7-4c6b-b520-8ce97c04f437" (UID: "91a21aa9-44e7-4c6b-b520-8ce97c04f437"). InnerVolumeSpecName "kube-api-access-x4w29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.044176 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.063640 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.101022 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4" path="/var/lib/kubelet/pods/3fcde7d2-d6e7-4386-89a6-0d0516cfa4b4/volumes" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.101889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.133390 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91a21aa9-44e7-4c6b-b520-8ce97c04f437-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.133428 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4w29\" (UniqueName: \"kubernetes.io/projected/91a21aa9-44e7-4c6b-b520-8ce97c04f437-kube-api-access-x4w29\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.194383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.214493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.307347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 01:31:29 crc kubenswrapper[4749]: W1129 01:31:29.311482 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd619b935_7717_4e88_af76_97e946d3cef5.slice/crio-0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73 WatchSource:0}: Error finding container 0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73: Status 404 returned error can't find the container with id 0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73 Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.362370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerStarted","Data":"0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.364524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3962a4be-25eb-45f6-8b1a-f84341319df3","Type":"ContainerStarted","Data":"d7a6822b2bd25f98130f4ad5d9e194fdde848d8e6de8b38846c9ae549b197ff8"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.366180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" event={"ID":"91a21aa9-44e7-4c6b-b520-8ce97c04f437","Type":"ContainerDied","Data":"63fff0f8a6437dd92ade54640617bcd3e018fb6aaa0cbb39d6e207e6248e4697"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.366229 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vvpnp" Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.367794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg" event={"ID":"65fd8520-689b-4f93-850e-bac0cec97025","Type":"ContainerStarted","Data":"0d9a67beb6e686c0fc7eeb05bdbaea75575fe1c7b9a8e4037d3f79f6feccf60f"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.370673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f479785d-0431-4aaf-88b6-ad9000996a52","Type":"ContainerStarted","Data":"80cbc49420a383a93584e75de307dda909a303a4cabdf309bbad574f8fe5b6f9"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.372115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerStarted","Data":"51f0a3d911779c645288199d5d3988fea18cd238d365c97137ef25a953555fbd"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.374084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerStarted","Data":"d10c8e8eafcd0f30d5b2663c5e1152a35d0cef673fe747a79ecbeee6e1f7e5c1"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.374128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerStarted","Data":"2532e9289f06b649b0623532b9fa6e8c2f9c88941885cc1d45dc17aee1e33643"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.375465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerStarted","Data":"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20"} Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.416808 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.425602 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vvpnp"] Nov 29 01:31:29 crc kubenswrapper[4749]: I1129 01:31:29.874112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:31:30 crc kubenswrapper[4749]: W1129 01:31:30.182901 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a087f05_8b7d_4207_88e8_1c622d57c653.slice/crio-a6ff43f195e2ec7ab096b323215b989009c6e18373a4a264d1fcf19c85ccc8bd WatchSource:0}: Error finding container a6ff43f195e2ec7ab096b323215b989009c6e18373a4a264d1fcf19c85ccc8bd: Status 404 returned error can't find the container with id a6ff43f195e2ec7ab096b323215b989009c6e18373a4a264d1fcf19c85ccc8bd Nov 29 01:31:30 crc kubenswrapper[4749]: I1129 01:31:30.389450 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerStarted","Data":"a6ff43f195e2ec7ab096b323215b989009c6e18373a4a264d1fcf19c85ccc8bd"} Nov 29 01:31:31 crc kubenswrapper[4749]: I1129 01:31:31.087229 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a21aa9-44e7-4c6b-b520-8ce97c04f437" path="/var/lib/kubelet/pods/91a21aa9-44e7-4c6b-b520-8ce97c04f437/volumes" Nov 29 01:31:32 crc kubenswrapper[4749]: I1129 
01:31:32.415289 4749 generic.go:334] "Generic (PLEG): container finished" podID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerID="fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20" exitCode=0 Nov 29 01:31:32 crc kubenswrapper[4749]: I1129 01:31:32.415347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerDied","Data":"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20"} Nov 29 01:31:33 crc kubenswrapper[4749]: I1129 01:31:33.427474 4749 generic.go:334] "Generic (PLEG): container finished" podID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerID="d10c8e8eafcd0f30d5b2663c5e1152a35d0cef673fe747a79ecbeee6e1f7e5c1" exitCode=0 Nov 29 01:31:33 crc kubenswrapper[4749]: I1129 01:31:33.427526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerDied","Data":"d10c8e8eafcd0f30d5b2663c5e1152a35d0cef673fe747a79ecbeee6e1f7e5c1"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.444526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f479785d-0431-4aaf-88b6-ad9000996a52","Type":"ContainerStarted","Data":"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.445268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.448052 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerID="b40e601df6ea3022f7b2ce3ff2a8ce5c29d60d4fcad9dbdc9f6bcfff8e5e9b2c" exitCode=0 Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.448152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerDied","Data":"b40e601df6ea3022f7b2ce3ff2a8ce5c29d60d4fcad9dbdc9f6bcfff8e5e9b2c"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.451023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerStarted","Data":"caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.455685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerStarted","Data":"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.457707 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerStarted","Data":"3dc57790e819b5c2fd7d24841a617fcfe2253270800c29923a0f8a3796f1a9a5"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.460733 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.350179051 podStartE2EDuration="24.46071386s" podCreationTimestamp="2025-11-29 01:31:11 +0000 UTC" firstStartedPulling="2025-11-29 01:31:29.092963482 +0000 UTC m=+1232.265113339" lastFinishedPulling="2025-11-29 01:31:34.203498291 +0000 UTC m=+1237.375648148" observedRunningTime="2025-11-29 01:31:35.458270169 +0000 UTC 
m=+1238.630420036" watchObservedRunningTime="2025-11-29 01:31:35.46071386 +0000 UTC m=+1238.632863727" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.476010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3962a4be-25eb-45f6-8b1a-f84341319df3","Type":"ContainerStarted","Data":"d8dc0505ade6c3a7b001fd49ab3118193dd6af3d47615b2a7ee7ceba6fef3ffc"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.476183 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.478541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerStarted","Data":"54b419e178792015e0c2b8fac863bb01960334b0e71784b51c9825ddfc510195"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.480725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg" event={"ID":"65fd8520-689b-4f93-850e-bac0cec97025","Type":"ContainerStarted","Data":"da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109"} Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.481002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9lxvg" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.496394 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.496369243 podStartE2EDuration="28.496369243s" podCreationTimestamp="2025-11-29 01:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:31:35.48657519 +0000 UTC m=+1238.658725077" watchObservedRunningTime="2025-11-29 01:31:35.496369243 +0000 UTC m=+1238.668519110" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.537897 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.705528511 podStartE2EDuration="29.537881291s" podCreationTimestamp="2025-11-29 01:31:06 +0000 UTC" firstStartedPulling="2025-11-29 01:31:08.429604689 +0000 UTC m=+1211.601754546" lastFinishedPulling="2025-11-29 01:31:28.261957469 +0000 UTC m=+1231.434107326" observedRunningTime="2025-11-29 01:31:35.534794825 +0000 UTC m=+1238.706944682" watchObservedRunningTime="2025-11-29 01:31:35.537881291 +0000 UTC m=+1238.710031148" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.559292 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9lxvg" podStartSLOduration=16.687947986 podStartE2EDuration="21.559275641s" podCreationTimestamp="2025-11-29 01:31:14 +0000 UTC" firstStartedPulling="2025-11-29 01:31:29.215466076 +0000 UTC m=+1232.387615933" lastFinishedPulling="2025-11-29 01:31:34.086793691 +0000 UTC m=+1237.258943588" observedRunningTime="2025-11-29 01:31:35.554932653 +0000 UTC m=+1238.727082510" watchObservedRunningTime="2025-11-29 01:31:35.559275641 +0000 UTC m=+1238.731425498" Nov 29 01:31:35 crc kubenswrapper[4749]: I1129 01:31:35.573407 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.846865742 podStartE2EDuration="26.573388081s" podCreationTimestamp="2025-11-29 01:31:09 +0000 UTC" firstStartedPulling="2025-11-29 01:31:29.053632648 +0000 UTC m=+1232.225782505" lastFinishedPulling="2025-11-29 
01:31:32.780154957 +0000 UTC m=+1235.952304844" observedRunningTime="2025-11-29 01:31:35.573066393 +0000 UTC m=+1238.745216250" watchObservedRunningTime="2025-11-29 01:31:35.573388081 +0000 UTC m=+1238.745537938" Nov 29 01:31:36 crc kubenswrapper[4749]: I1129 01:31:36.493299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerStarted","Data":"cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952"} Nov 29 01:31:36 crc kubenswrapper[4749]: I1129 01:31:36.494137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerStarted","Data":"6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9"} Nov 29 01:31:36 crc kubenswrapper[4749]: I1129 01:31:36.521817 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2k6f9" podStartSLOduration=18.072488979 podStartE2EDuration="22.521796121s" podCreationTimestamp="2025-11-29 01:31:14 +0000 UTC" firstStartedPulling="2025-11-29 01:31:29.221520686 +0000 UTC m=+1232.393670543" lastFinishedPulling="2025-11-29 01:31:33.670827788 +0000 UTC m=+1236.842977685" observedRunningTime="2025-11-29 01:31:36.521670828 +0000 UTC m=+1239.693820695" watchObservedRunningTime="2025-11-29 01:31:36.521796121 +0000 UTC m=+1239.693945988" Nov 29 01:31:37 crc kubenswrapper[4749]: I1129 01:31:37.514298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:37 crc kubenswrapper[4749]: I1129 01:31:37.514878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:31:37 crc kubenswrapper[4749]: I1129 01:31:37.881546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 29 01:31:37 crc kubenswrapper[4749]: I1129 01:31:37.881953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 29 01:31:38 crc kubenswrapper[4749]: I1129 01:31:38.523925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerStarted","Data":"d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2"} Nov 29 01:31:38 crc kubenswrapper[4749]: I1129 01:31:38.528119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerStarted","Data":"6e9657e12a54da3db09ff22f9ac8c9eb8a67c2424dfe7eae2c12bb076d0ac942"} Nov 29 01:31:38 crc kubenswrapper[4749]: I1129 01:31:38.554661 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.441842501 podStartE2EDuration="21.554644912s" podCreationTimestamp="2025-11-29 01:31:17 +0000 UTC" firstStartedPulling="2025-11-29 01:31:29.313943125 +0000 UTC m=+1232.486092972" lastFinishedPulling="2025-11-29 01:31:37.426745526 +0000 UTC m=+1240.598895383" observedRunningTime="2025-11-29 01:31:38.549048253 +0000 UTC m=+1241.721198120" watchObservedRunningTime="2025-11-29 01:31:38.554644912 +0000 UTC m=+1241.726794779" Nov 29 01:31:38 crc kubenswrapper[4749]: I1129 01:31:38.575889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" 
podStartSLOduration=17.326692238 podStartE2EDuration="24.575868368s" podCreationTimestamp="2025-11-29 01:31:14 +0000 UTC" firstStartedPulling="2025-11-29 01:31:30.185976094 +0000 UTC m=+1233.358125951" lastFinishedPulling="2025-11-29 01:31:37.435152224 +0000 UTC m=+1240.607302081" observedRunningTime="2025-11-29 01:31:38.575662062 +0000 UTC m=+1241.747811959" watchObservedRunningTime="2025-11-29 01:31:38.575868368 +0000 UTC m=+1241.748018235" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.148980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.246504 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.246642 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.540007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerStarted","Data":"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e"} Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.590392 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.609372 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 29 01:31:39 crc kubenswrapper[4749]: I1129 01:31:39.656469 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.207011 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.207975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.280228 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.326772 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.547732 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerID="020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb" exitCode=0 Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.547794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" event={"ID":"3d6a2b30-fdb2-4552-b574-0843137e497b","Type":"ContainerDied","Data":"020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb"} Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.551155 4749 generic.go:334] "Generic (PLEG): container finished" podID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerID="5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e" exitCode=0 Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.551273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" 
event={"ID":"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e","Type":"ContainerDied","Data":"5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e"} Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.551604 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.609694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.612454 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.769686 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"] Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.801373 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.802752 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.806249 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.817901 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.921350 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.921412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twq7\" (UniqueName: \"kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.921436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.921518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.949420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.950390 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.953469 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.964441 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:31:40 crc kubenswrapper[4749]: I1129 01:31:40.979015 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.024233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.024352 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.024387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twq7\" (UniqueName: \"kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.024411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.025394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.025920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.029562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.056061 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.057370 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.059912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twq7\" (UniqueName: \"kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7\") pod \"dnsmasq-dns-7f896c8c65-p9qf8\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.070469 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.073355 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.105694 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.107073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.107155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.110949 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-chcbk" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.110993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.111173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.111232 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.120121 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.125705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.125853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.125895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.125940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.125964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbklg\" (UniqueName: \"kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.126011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.228190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.228805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.228834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts\") pod \"ovn-northd-0\" (UID: 
\"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.228944 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229032 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg4qv\" (UniqueName: \"kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229404 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" 
Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229436 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4q64\" (UniqueName: \"kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229460 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbklg\" (UniqueName: \"kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.229695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.230943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc 
kubenswrapper[4749]: I1129 01:31:41.237316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.237385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.258321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbklg\" (UniqueName: \"kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg\") pod \"ovn-controller-metrics-6zwjh\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.271154 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg4qv\" (UniqueName: \"kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4q64\" (UniqueName: \"kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.333903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.336739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.336812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.338697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.340070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.340705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.350036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.354085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg4qv\" (UniqueName: \"kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.370299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.436616 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.570935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.572476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.576619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.586189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4q64\" (UniqueName: \"kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64\") pod \"dnsmasq-dns-86db49b7ff-hgc22\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.588835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" 
event={"ID":"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e","Type":"ContainerStarted","Data":"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599"} Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.638484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.645704 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.667945 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.687756 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.713847 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.715592 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.720597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.789789 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:31:41 crc kubenswrapper[4749]: W1129 01:31:41.803818 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e2ea248_43ad_4851_96ea_3e6adaba3ef0.slice/crio-68f5326f39d1f34d4dae49a13444765c6df1393fec50100f7257dd359a5608a9 WatchSource:0}: Error finding container 68f5326f39d1f34d4dae49a13444765c6df1393fec50100f7257dd359a5608a9: Status 404 returned error can't find the container with id 68f5326f39d1f34d4dae49a13444765c6df1393fec50100f7257dd359a5608a9 Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.846656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnphs\" (UniqueName: \"kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.847107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.847149 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.847236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb\") pod 
\"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.850647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.933769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.957296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.957424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnphs\" (UniqueName: \"kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.957503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.957543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.957625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.958717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.959171 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.959857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.962317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:41 crc kubenswrapper[4749]: I1129 01:31:41.979374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnphs\" (UniqueName: \"kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs\") pod \"dnsmasq-dns-698758b865-5c7gw\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.061974 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.359695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.615165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerStarted","Data":"d15407c83f3311f8c3958f6e4bd3c9da53a4bea36613017566ab26cfa4a60437"} Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.621044 4749 generic.go:334] "Generic (PLEG): container finished" podID="9b26ba14-29ee-4c24-9781-31921928562a" containerID="925552f3d845c3cd90d5e7e28621b50d90fb63adc636f5514c7002726c4f5a81" exitCode=0 Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.621176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" event={"ID":"9b26ba14-29ee-4c24-9781-31921928562a","Type":"ContainerDied","Data":"925552f3d845c3cd90d5e7e28621b50d90fb63adc636f5514c7002726c4f5a81"} Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.621245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" event={"ID":"9b26ba14-29ee-4c24-9781-31921928562a","Type":"ContainerStarted","Data":"9761e3507b0066056ba0df612ae96e40313f38c9d02f515cdd310f376e3e2144"} Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.625007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerStarted","Data":"de49a7f8016c592329b857f9d4f7dfc7e1181f8e388cac615fdbefdcd54b5aab"} Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.631306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" event={"ID":"3d6a2b30-fdb2-4552-b574-0843137e497b","Type":"ContainerStarted","Data":"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19"} Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.631433 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="dnsmasq-dns" containerID="cri-o://18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19" gracePeriod=10 Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.631664 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.637715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6zwjh" event={"ID":"9e2ea248-43ad-4851-96ea-3e6adaba3ef0","Type":"ContainerStarted","Data":"23d0cb96cf04c81af3bc6f2003ca35fb96d25512a2dad19539c520ac601aba5c"}
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.637753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6zwjh" event={"ID":"9e2ea248-43ad-4851-96ea-3e6adaba3ef0","Type":"ContainerStarted","Data":"68f5326f39d1f34d4dae49a13444765c6df1393fec50100f7257dd359a5608a9"}
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.642580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerStarted","Data":"11f7ead3ae7a578e0a2ea81d3114b0e794f04b33101b661640147fb015d3433e"}
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.642635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerStarted","Data":"d38299b311ff2f680d22e894e59ac93bc6aa723270c1526c988fc0941a443f58"}
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.643287 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="dnsmasq-dns" containerID="cri-o://031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599" gracePeriod=10
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.643304 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.652410 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"]
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.679172 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" podStartSLOduration=3.407874041 podStartE2EDuration="37.679150778s" podCreationTimestamp="2025-11-29 01:31:05 +0000 UTC" firstStartedPulling="2025-11-29 01:31:05.99264383 +0000 UTC m=+1209.164793687" lastFinishedPulling="2025-11-29 01:31:40.263920567 +0000 UTC m=+1243.436070424" observedRunningTime="2025-11-29 01:31:42.676495442 +0000 UTC m=+1245.848645299" watchObservedRunningTime="2025-11-29 01:31:42.679150778 +0000 UTC m=+1245.851300635"
Nov 29 01:31:42 crc kubenswrapper[4749]: W1129 01:31:42.700469 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf32a260_17ee_431a_ab31_9b2215b6823f.slice/crio-d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a WatchSource:0}: Error finding container d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a: Status 404 returned error can't find the container with id d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.709015 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6zwjh" podStartSLOduration=2.708993887 podStartE2EDuration="2.708993887s" podCreationTimestamp="2025-11-29 01:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:31:42.700755613 +0000 UTC m=+1245.872905470" watchObservedRunningTime="2025-11-29 01:31:42.708993887 +0000 UTC m=+1245.881143744"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.780058 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" podStartSLOduration=-9223371998.074745 podStartE2EDuration="38.780031357s" podCreationTimestamp="2025-11-29 01:31:04 +0000 UTC" firstStartedPulling="2025-11-29 01:31:05.99991205 +0000 UTC m=+1209.172061907" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:31:42.765268261 +0000 UTC m=+1245.937418128" watchObservedRunningTime="2025-11-29 01:31:42.780031357 +0000 UTC m=+1245.952181214"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.881776 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.894344 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.896921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lxncd"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.897185 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.897344 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.908804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.922144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.984256 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.984440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.984635 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt7ns\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.984721 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:42 crc kubenswrapper[4749]: I1129 01:31:42.984889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.012150 4749 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Nov 29 01:31:43 crc kubenswrapper[4749]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/6d6b6ec0-3dec-4d20-919f-128342cb1eb1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 29 01:31:43 crc kubenswrapper[4749]: > podSandboxID="d38299b311ff2f680d22e894e59ac93bc6aa723270c1526c988fc0941a443f58"
Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.012324 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 29 01:31:43 crc kubenswrapper[4749]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4q64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-hgc22_openstack(6d6b6ec0-3dec-4d20-919f-128342cb1eb1): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/6d6b6ec0-3dec-4d20-919f-128342cb1eb1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 29 01:31:43 crc kubenswrapper[4749]: > logger="UnhandledError"
Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.013515 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/6d6b6ec0-3dec-4d20-919f-128342cb1eb1/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1"
Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.086403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.086478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.086511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.086588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt7ns\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.086626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
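
The CreateContainerError above is a subPath failure, not a config error: every hosts file in this container is projected from a ConfigMap key via SubPath, and for each such mount the kubelet prepares a bind-mount source under /var/lib/kubelet/pods/<uid>/volume-subpaths/ before asking CRI-O to create the container. Here CRI-O found that prepared path missing, plausibly a race with the ConfigMap volume being set up mid-rollout; the retried sync at 01:31:44 below starts the container successfully. For reference, a sketch of the four subPath mounts as they appear in the spec dump above, rebuilt with k8s.io/api/core/v1 types (illustrative only, not the operator's actual code):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// The four subPath mounts from the logged spec: each projects a single
	// ConfigMap key into /etc/dnsmasq.d via a kubelet-managed bind mount.
	mounts := []corev1.VolumeMount{
		{Name: "config", ReadOnly: true, MountPath: "/etc/dnsmasq.d/config.cfg", SubPath: "dns"},
		{Name: "dns-svc", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/dns-svc", SubPath: "dns-svc"},
		{Name: "ovsdbserver-nb", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/ovsdbserver-nb", SubPath: "ovsdbserver-nb"},
		{Name: "ovsdbserver-sb", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/ovsdbserver-sb", SubPath: "ovsdbserver-sb"},
	}
	for _, m := range mounts {
		fmt.Printf("%s -> %s (subPath %q)\n", m.Name, m.MountPath, m.SubPath)
	}
}
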
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.087678 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.087905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.088461 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.088484 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.088523 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:31:43.588508427 +0000 UTC m=+1246.760658274 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.107837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt7ns\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.111269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.423700 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.510458 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.513845 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.534857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.593809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc\") pod \"9b26ba14-29ee-4c24-9781-31921928562a\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.593900 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config\") pod \"3d6a2b30-fdb2-4552-b574-0843137e497b\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.593919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb\") pod \"9b26ba14-29ee-4c24-9781-31921928562a\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.593945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwmz\" (UniqueName: \"kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz\") pod \"3d6a2b30-fdb2-4552-b574-0843137e497b\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.594568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config\") pod \"9b26ba14-29ee-4c24-9781-31921928562a\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.594627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc\") pod \"3d6a2b30-fdb2-4552-b574-0843137e497b\" (UID: \"3d6a2b30-fdb2-4552-b574-0843137e497b\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.594666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twq7\" (UniqueName: \"kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7\") pod \"9b26ba14-29ee-4c24-9781-31921928562a\" (UID: \"9b26ba14-29ee-4c24-9781-31921928562a\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.594950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.595316 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.595334 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.595400 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:31:44.595363351 +0000 UTC m=+1247.767513208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.604620 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz" (OuterVolumeSpecName: "kube-api-access-2wwmz") pod "3d6a2b30-fdb2-4552-b574-0843137e497b" (UID: "3d6a2b30-fdb2-4552-b574-0843137e497b"). InnerVolumeSpecName "kube-api-access-2wwmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.607174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7" (OuterVolumeSpecName: "kube-api-access-7twq7") pod "9b26ba14-29ee-4c24-9781-31921928562a" (UID: "9b26ba14-29ee-4c24-9781-31921928562a"). InnerVolumeSpecName "kube-api-access-7twq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.608526 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.625091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b26ba14-29ee-4c24-9781-31921928562a" (UID: "9b26ba14-29ee-4c24-9781-31921928562a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.636353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config" (OuterVolumeSpecName: "config") pod "9b26ba14-29ee-4c24-9781-31921928562a" (UID: "9b26ba14-29ee-4c24-9781-31921928562a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.649546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b26ba14-29ee-4c24-9781-31921928562a" (UID: "9b26ba14-29ee-4c24-9781-31921928562a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.667163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d6a2b30-fdb2-4552-b574-0843137e497b" (UID: "3d6a2b30-fdb2-4552-b574-0843137e497b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.674116 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerID="18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19" exitCode=0 Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.674175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" event={"ID":"3d6a2b30-fdb2-4552-b574-0843137e497b","Type":"ContainerDied","Data":"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.674217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" event={"ID":"3d6a2b30-fdb2-4552-b574-0843137e497b","Type":"ContainerDied","Data":"54bb9950ec5f4b505c25bc2a77f36f8daccdf282fcbd17d632c3af4fc0b6d357"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.674235 4749 scope.go:117] "RemoveContainer" containerID="18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.674341 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-gh8rv" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.683167 4749 generic.go:334] "Generic (PLEG): container finished" podID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerID="031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599" exitCode=0 Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.683281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" event={"ID":"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e","Type":"ContainerDied","Data":"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.683311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" event={"ID":"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e","Type":"ContainerDied","Data":"641d3051f50184ccac9891693914441b4bdc85d1bb17e5b9a316309b0f59df84"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.683317 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-z8x8p" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.691377 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerID="11f7ead3ae7a578e0a2ea81d3114b0e794f04b33101b661640147fb015d3433e" exitCode=0 Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.691470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerDied","Data":"11f7ead3ae7a578e0a2ea81d3114b0e794f04b33101b661640147fb015d3433e"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.693630 4749 generic.go:334] "Generic (PLEG): container finished" podID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerID="2aef9b804a300d8b097b904359b47ca8d68cf5bd5cb50f9e4a509ec03501f312" exitCode=0 Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.693679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5c7gw" event={"ID":"cf32a260-17ee-431a-ab31-9b2215b6823f","Type":"ContainerDied","Data":"2aef9b804a300d8b097b904359b47ca8d68cf5bd5cb50f9e4a509ec03501f312"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.693696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5c7gw" event={"ID":"cf32a260-17ee-431a-ab31-9b2215b6823f","Type":"ContainerStarted","Data":"d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.695571 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc\") pod \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.695726 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config\") pod \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.695769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgp8b\" (UniqueName: \"kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b\") pod \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\" (UID: \"7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e\") " Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696110 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twq7\" (UniqueName: \"kubernetes.io/projected/9b26ba14-29ee-4c24-9781-31921928562a-kube-api-access-7twq7\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696130 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696141 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696150 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwmz\" (UniqueName: 
\"kubernetes.io/projected/3d6a2b30-fdb2-4552-b574-0843137e497b-kube-api-access-2wwmz\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696161 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b26ba14-29ee-4c24-9781-31921928562a-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.696172 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.699121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b" (OuterVolumeSpecName: "kube-api-access-mgp8b") pod "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" (UID: "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e"). InnerVolumeSpecName "kube-api-access-mgp8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.701297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config" (OuterVolumeSpecName: "config") pod "3d6a2b30-fdb2-4552-b574-0843137e497b" (UID: "3d6a2b30-fdb2-4552-b574-0843137e497b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.703967 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.705420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-p9qf8" event={"ID":"9b26ba14-29ee-4c24-9781-31921928562a","Type":"ContainerDied","Data":"9761e3507b0066056ba0df612ae96e40313f38c9d02f515cdd310f376e3e2144"} Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.716007 4749 scope.go:117] "RemoveContainer" containerID="020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.746388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" (UID: "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.763220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config" (OuterVolumeSpecName: "config") pod "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" (UID: "7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.775157 4749 scope.go:117] "RemoveContainer" containerID="18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.775782 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19\": container with ID starting with 18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19 not found: ID does not exist" containerID="18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.775828 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19"} err="failed to get container status \"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19\": rpc error: code = NotFound desc = could not find container \"18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19\": container with ID starting with 18b8895690e5cc6d79725867f5b872e3baa0c3e0747f26dc25594bae115ebb19 not found: ID does not exist" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.775855 4749 scope.go:117] "RemoveContainer" containerID="020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.777285 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb\": container with ID starting with 020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb not found: ID does not exist" containerID="020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.777310 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb"} err="failed to get container status \"020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb\": rpc error: code = NotFound desc = could not find container \"020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb\": container with ID starting with 020dc5febabd9e407b7a9f12da089c0f61fa5cecc82e158561f245e0c1a0b5eb not found: ID does not exist" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.777326 4749 scope.go:117] "RemoveContainer" containerID="031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.795744 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.798020 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.798043 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d6a2b30-fdb2-4552-b574-0843137e497b-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.798053 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.798064 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgp8b\" (UniqueName: \"kubernetes.io/projected/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e-kube-api-access-mgp8b\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.819434 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-p9qf8"] Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.826445 4749 scope.go:117] "RemoveContainer" containerID="5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.859656 4749 scope.go:117] "RemoveContainer" containerID="031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.860259 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599\": container with ID starting with 031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599 not found: ID does not exist" containerID="031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.860328 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599"} err="failed to get container status \"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599\": rpc error: code = NotFound desc = could not find container \"031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599\": container with ID starting with 031f6e9a727aba412c2af7a860d2e931197c79cf2170391f155372da22f54599 not found: ID does not exist" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.860364 4749 scope.go:117] "RemoveContainer" containerID="5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e" Nov 29 01:31:43 crc kubenswrapper[4749]: E1129 01:31:43.860955 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e\": container with ID starting with 5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e not found: ID does not exist" containerID="5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.861007 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e"} err="failed to get container status \"5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e\": rpc error: code = NotFound desc = could not find container \"5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e\": container with ID starting with 5d2d9cb43979fb2cb26abbc8bae246f45c5cd5078b8aa6303ce1022ca3d9fd7e not found: ID does not exist" Nov 29 01:31:43 crc kubenswrapper[4749]: I1129 01:31:43.861040 4749 scope.go:117] "RemoveContainer" containerID="925552f3d845c3cd90d5e7e28621b50d90fb63adc636f5514c7002726c4f5a81" Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.033152 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"] Nov 29 
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.072528 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-gh8rv"]
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.088340 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"]
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.098584 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-z8x8p"]
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.612323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:44 crc kubenswrapper[4749]: E1129 01:31:44.612569 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 29 01:31:44 crc kubenswrapper[4749]: E1129 01:31:44.612596 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 29 01:31:44 crc kubenswrapper[4749]: E1129 01:31:44.612661 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:31:46.612639008 +0000 UTC m=+1249.784788895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.718129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerStarted","Data":"54a83438d867dacb43afd491866da3b4e42fc03fc4d37a58d1393a675b29c458"}
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.718561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerStarted","Data":"32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27"}
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.718598 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.723679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerStarted","Data":"99755c868763f972128f21ca651af421ac1e461e7cfbe7a329adf4cc6dd1628d"}
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.723924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22"
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.734427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5c7gw" event={"ID":"cf32a260-17ee-431a-ab31-9b2215b6823f","Type":"ContainerStarted","Data":"d387c2987e2dc3792bc71d9ecfbf2e6a56b4c44b79c8cc37de7229402aa58215"}
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.734926 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-5c7gw"
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.756968 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.333676291 podStartE2EDuration="3.756942982s" podCreationTimestamp="2025-11-29 01:31:41 +0000 UTC" firstStartedPulling="2025-11-29 01:31:41.978382442 +0000 UTC m=+1245.150532299" lastFinishedPulling="2025-11-29 01:31:43.401649133 +0000 UTC m=+1246.573798990" observedRunningTime="2025-11-29 01:31:44.747232051 +0000 UTC m=+1247.919381908" watchObservedRunningTime="2025-11-29 01:31:44.756942982 +0000 UTC m=+1247.929092849"
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.780945 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" podStartSLOduration=4.780918716 podStartE2EDuration="4.780918716s" podCreationTimestamp="2025-11-29 01:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:31:44.775905722 +0000 UTC m=+1247.948055589" watchObservedRunningTime="2025-11-29 01:31:44.780918716 +0000 UTC m=+1247.953068583"
Nov 29 01:31:44 crc kubenswrapper[4749]: I1129 01:31:44.804691 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-5c7gw" podStartSLOduration=3.804668314 podStartE2EDuration="3.804668314s" podCreationTimestamp="2025-11-29 01:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:31:44.802058579 +0000 UTC m=+1247.974208436" watchObservedRunningTime="2025-11-29 01:31:44.804668314 +0000 UTC m=+1247.976818171"
Nov 29 01:31:45 crc kubenswrapper[4749]: I1129 01:31:45.092583 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" path="/var/lib/kubelet/pods/3d6a2b30-fdb2-4552-b574-0843137e497b/volumes"
Nov 29 01:31:45 crc kubenswrapper[4749]: I1129 01:31:45.094320 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" path="/var/lib/kubelet/pods/7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e/volumes"
Nov 29 01:31:45 crc kubenswrapper[4749]: I1129 01:31:45.095547 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b26ba14-29ee-4c24-9781-31921928562a" path="/var/lib/kubelet/pods/9b26ba14-29ee-4c24-9781-31921928562a/volumes"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.657411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0"
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.657625 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.657646 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.657710 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:31:50.65769247 +0000 UTC m=+1253.829842327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.765284 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jzclg"]
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.766298 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766329 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.766347 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766357 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.766377 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766385 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.766409 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766418 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: E1129 01:31:46.766453 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b26ba14-29ee-4c24-9781-31921928562a" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766463 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b26ba14-29ee-4c24-9781-31921928562a" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766931 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6a2b30-fdb2-4552-b574-0843137e497b" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766962 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb31b93-8853-4eb4-a2b5-e8dec5c1c55e" containerName="dnsmasq-dns"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.766987 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b26ba14-29ee-4c24-9781-31921928562a" containerName="init"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.768409 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.771635 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.771658 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.771946 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.792627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jzclg"]
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.862818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.862951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.863189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.863349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.863659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.863827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg"
Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.863889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8tq\" (UniqueName: \"kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29
01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.965974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.966006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh8tq\" (UniqueName: \"kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.966485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.968176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.968316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.977421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.978435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:46 crc kubenswrapper[4749]: I1129 01:31:46.981964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:47 crc kubenswrapper[4749]: I1129 01:31:47.001330 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh8tq\" (UniqueName: \"kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq\") pod \"swift-ring-rebalance-jzclg\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:47 crc kubenswrapper[4749]: I1129 01:31:47.090813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:31:47 crc kubenswrapper[4749]: I1129 01:31:47.549608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jzclg"] Nov 29 01:31:47 crc kubenswrapper[4749]: I1129 01:31:47.780814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jzclg" event={"ID":"ef3ffdde-1c21-47d4-9a07-a84008344e08","Type":"ContainerStarted","Data":"30db410dc0d07d7207be4d33b2a3c84ea14af7a16a81d151231c9fb873bdd020"} Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.217227 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6a3a-account-create-update-h42h7"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.218904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.223530 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.252983 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a3a-account-create-update-h42h7"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.288588 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q6wqg"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.289896 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.295919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q6wqg"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.315947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.316016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8zd\" (UniqueName: \"kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.417494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8zd\" (UniqueName: \"kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.418019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkvs6\" (UniqueName: \"kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.418117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.418241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.419303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.444911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8zd\" (UniqueName: \"kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd\") pod \"keystone-6a3a-account-create-update-h42h7\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " 
pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.465726 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xrwj6"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.466875 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.487534 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrwj6"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.519340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkvs6\" (UniqueName: \"kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.519413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.519502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m846t\" (UniqueName: \"kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.519587 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.521039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.549806 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.551253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkvs6\" (UniqueName: \"kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6\") pod \"keystone-db-create-q6wqg\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.615154 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.623704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m846t\" (UniqueName: \"kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.623901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.624761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.627526 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7db1-account-create-update-r9wsv"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.629473 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.632617 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.635787 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7db1-account-create-update-r9wsv"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.641965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m846t\" (UniqueName: \"kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t\") pod \"placement-db-create-xrwj6\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.728181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7dpw\" (UniqueName: \"kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.728266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.788615 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jm2kl"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.789985 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.799510 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jm2kl"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.811658 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.830265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.830335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7dpw\" (UniqueName: \"kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.830360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.830416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vnnc\" (UniqueName: \"kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.831090 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.850135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7dpw\" (UniqueName: \"kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw\") pod \"placement-7db1-account-create-update-r9wsv\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.908457 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-261d-account-create-update-j6mzs"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.909725 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.912247 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.918323 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-261d-account-create-update-j6mzs"] Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.932175 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vnnc\" (UniqueName: \"kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.932308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.933066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:49 crc kubenswrapper[4749]: I1129 01:31:49.958873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vnnc\" (UniqueName: \"kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc\") pod \"glance-db-create-jm2kl\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.034089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.034139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sff58\" (UniqueName: \"kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.054627 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.101342 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6a3a-account-create-update-h42h7"] Nov 29 01:31:50 crc kubenswrapper[4749]: W1129 01:31:50.105675 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb7bb7bc_84d7_4d67_ba9d_fd61bbfc973a.slice/crio-d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe WatchSource:0}: Error finding container d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe: Status 404 returned error can't find the container with id d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.118323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.136313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.136405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sff58\" (UniqueName: \"kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.137228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.154838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sff58\" (UniqueName: \"kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58\") pod \"glance-261d-account-create-update-j6mzs\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.188248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q6wqg"] Nov 29 01:31:50 crc kubenswrapper[4749]: W1129 01:31:50.189857 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91b13a5c_d047_40da_8a7f_87debe2da732.slice/crio-945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c WatchSource:0}: Error finding container 945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c: Status 404 returned error can't find the container with id 945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.234826 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.310451 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xrwj6"] Nov 29 01:31:50 crc kubenswrapper[4749]: W1129 01:31:50.323071 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9b1aed7_5e53_49ac_917c_adfacd5013fb.slice/crio-cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4 WatchSource:0}: Error finding container cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4: Status 404 returned error can't find the container with id cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4 Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.529637 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7db1-account-create-update-r9wsv"] Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.543593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-261d-account-create-update-j6mzs"] Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.628342 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jm2kl"] Nov 29 01:31:50 crc kubenswrapper[4749]: W1129 01:31:50.641321 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfe7c0f3_6d3d_4117_af54_c7c79fd09d47.slice/crio-ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29 WatchSource:0}: Error finding container ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29: Status 404 returned error can't find the container with id ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29 Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.750829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:50 crc kubenswrapper[4749]: E1129 01:31:50.750986 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 01:31:50 crc kubenswrapper[4749]: E1129 01:31:50.751006 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 01:31:50 crc kubenswrapper[4749]: E1129 01:31:50.751053 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:31:58.751039805 +0000 UTC m=+1261.923189652 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.815965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db1-account-create-update-r9wsv" event={"ID":"e0133422-f6fd-44ca-b8d0-d9e0696aad80","Type":"ContainerStarted","Data":"e4273792b3dc8636ef89462cf1b1b93f5b403f589624f9f0c04332f9c4b4ef5c"} Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.817671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jm2kl" event={"ID":"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47","Type":"ContainerStarted","Data":"ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29"} Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.819576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a3a-account-create-update-h42h7" event={"ID":"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a","Type":"ContainerStarted","Data":"d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe"} Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.820954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-261d-account-create-update-j6mzs" event={"ID":"2a17ee11-c21e-4edd-b1cf-8ba27bae166c","Type":"ContainerStarted","Data":"86f39bef6ad8370aa4c7a2b1c52956a19614d65ea6e164dca92b1678608e8954"} Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.822497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q6wqg" event={"ID":"91b13a5c-d047-40da-8a7f-87debe2da732","Type":"ContainerStarted","Data":"945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c"} Nov 29 01:31:50 crc kubenswrapper[4749]: I1129 01:31:50.823546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrwj6" event={"ID":"c9b1aed7-5e53-49ac-917c-adfacd5013fb","Type":"ContainerStarted","Data":"cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4"} Nov 29 01:31:51 crc kubenswrapper[4749]: I1129 01:31:51.690406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:52 crc kubenswrapper[4749]: I1129 01:31:52.063485 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:31:52 crc kubenswrapper[4749]: I1129 01:31:52.125191 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:31:52 crc kubenswrapper[4749]: I1129 01:31:52.128798 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="dnsmasq-dns" containerID="cri-o://99755c868763f972128f21ca651af421ac1e461e7cfbe7a329adf4cc6dd1628d" gracePeriod=10 Nov 29 01:31:54 crc kubenswrapper[4749]: I1129 01:31:54.861668 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerID="99755c868763f972128f21ca651af421ac1e461e7cfbe7a329adf4cc6dd1628d" exitCode=0 Nov 29 01:31:54 crc kubenswrapper[4749]: I1129 01:31:54.861724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" 
event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerDied","Data":"99755c868763f972128f21ca651af421ac1e461e7cfbe7a329adf4cc6dd1628d"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.876820 4749 generic.go:334] "Generic (PLEG): container finished" podID="bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" containerID="f7ced13b7586036aaf8e13a7f09182c81b2e554a8054b19fe03c90cd5ca2e42b" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.876957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jm2kl" event={"ID":"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47","Type":"ContainerDied","Data":"f7ced13b7586036aaf8e13a7f09182c81b2e554a8054b19fe03c90cd5ca2e42b"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.879483 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" containerID="1e24c52665f61a339ea805c9382b78f4544ad14785f2be7457cd2e7fb68f1536" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.879616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a3a-account-create-update-h42h7" event={"ID":"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a","Type":"ContainerDied","Data":"1e24c52665f61a339ea805c9382b78f4544ad14785f2be7457cd2e7fb68f1536"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.883212 4749 generic.go:334] "Generic (PLEG): container finished" podID="2a17ee11-c21e-4edd-b1cf-8ba27bae166c" containerID="4443b1791cbae8653c74d15ac09fef4fc47d04d4da5dd53b897e42b9925d9a8e" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.883345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-261d-account-create-update-j6mzs" event={"ID":"2a17ee11-c21e-4edd-b1cf-8ba27bae166c","Type":"ContainerDied","Data":"4443b1791cbae8653c74d15ac09fef4fc47d04d4da5dd53b897e42b9925d9a8e"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.885332 4749 generic.go:334] "Generic (PLEG): container finished" podID="91b13a5c-d047-40da-8a7f-87debe2da732" containerID="c77043553ae219c84d2c309f4a2aa4114698fbead2ddaa374f410e948b3b3b37" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.885439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q6wqg" event={"ID":"91b13a5c-d047-40da-8a7f-87debe2da732","Type":"ContainerDied","Data":"c77043553ae219c84d2c309f4a2aa4114698fbead2ddaa374f410e948b3b3b37"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.892128 4749 generic.go:334] "Generic (PLEG): container finished" podID="c9b1aed7-5e53-49ac-917c-adfacd5013fb" containerID="264f3785ab82460c75c592ee847ad04af28a33b846be9d53ed84791d7f08ac24" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.893108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrwj6" event={"ID":"c9b1aed7-5e53-49ac-917c-adfacd5013fb","Type":"ContainerDied","Data":"264f3785ab82460c75c592ee847ad04af28a33b846be9d53ed84791d7f08ac24"} Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.899560 4749 generic.go:334] "Generic (PLEG): container finished" podID="e0133422-f6fd-44ca-b8d0-d9e0696aad80" containerID="877b8b5eb182d8827c6031d04c1c5302137ddc4813ede6360099d452ed4c9288" exitCode=0 Nov 29 01:31:55 crc kubenswrapper[4749]: I1129 01:31:55.899659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db1-account-create-update-r9wsv" 
event={"ID":"e0133422-f6fd-44ca-b8d0-d9e0696aad80","Type":"ContainerDied","Data":"877b8b5eb182d8827c6031d04c1c5302137ddc4813ede6360099d452ed4c9288"} Nov 29 01:31:56 crc kubenswrapper[4749]: I1129 01:31:56.516571 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.859891 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.909413 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sff58\" (UniqueName: \"kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58\") pod \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.909475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts\") pod \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\" (UID: \"2a17ee11-c21e-4edd-b1cf-8ba27bae166c\") " Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.910736 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a17ee11-c21e-4edd-b1cf-8ba27bae166c" (UID: "2a17ee11-c21e-4edd-b1cf-8ba27bae166c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.917091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58" (OuterVolumeSpecName: "kube-api-access-sff58") pod "2a17ee11-c21e-4edd-b1cf-8ba27bae166c" (UID: "2a17ee11-c21e-4edd-b1cf-8ba27bae166c"). InnerVolumeSpecName "kube-api-access-sff58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.939440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" event={"ID":"6d6b6ec0-3dec-4d20-919f-128342cb1eb1","Type":"ContainerDied","Data":"d38299b311ff2f680d22e894e59ac93bc6aa723270c1526c988fc0941a443f58"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.939519 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d38299b311ff2f680d22e894e59ac93bc6aa723270c1526c988fc0941a443f58" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.941954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q6wqg" event={"ID":"91b13a5c-d047-40da-8a7f-87debe2da732","Type":"ContainerDied","Data":"945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.942017 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="945866d324194d6607aeabf23dac8f645d8d227383c02555566ef826e26a1d6c" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.944111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xrwj6" event={"ID":"c9b1aed7-5e53-49ac-917c-adfacd5013fb","Type":"ContainerDied","Data":"cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.944143 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd6972ea887e1de6f0cc1bf9d00fa4d10fcc494b6631e8e348aca225e882b3f4" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.949966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7db1-account-create-update-r9wsv" event={"ID":"e0133422-f6fd-44ca-b8d0-d9e0696aad80","Type":"ContainerDied","Data":"e4273792b3dc8636ef89462cf1b1b93f5b403f589624f9f0c04332f9c4b4ef5c"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.949987 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4273792b3dc8636ef89462cf1b1b93f5b403f589624f9f0c04332f9c4b4ef5c" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.955607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6a3a-account-create-update-h42h7" event={"ID":"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a","Type":"ContainerDied","Data":"d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.955647 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d81a99f4353308dc18c1a77a0c9cfa350aae50e6db3b97689385a76bd9ab34fe" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.959664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-261d-account-create-update-j6mzs" event={"ID":"2a17ee11-c21e-4edd-b1cf-8ba27bae166c","Type":"ContainerDied","Data":"86f39bef6ad8370aa4c7a2b1c52956a19614d65ea6e164dca92b1678608e8954"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.959709 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86f39bef6ad8370aa4c7a2b1c52956a19614d65ea6e164dca92b1678608e8954" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.959720 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-261d-account-create-update-j6mzs" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.963486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jm2kl" event={"ID":"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47","Type":"ContainerDied","Data":"ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29"} Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.963520 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec807842cb8205669a4ffe18829e870144f4778e103d1082c6d53c854ff8dd29" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.965480 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:57 crc kubenswrapper[4749]: I1129 01:31:57.996551 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.011230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4q64\" (UniqueName: \"kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64\") pod \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.011307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb\") pod \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.011395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config\") pod \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.011455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb\") pod \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.011589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc\") pod \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\" (UID: \"6d6b6ec0-3dec-4d20-919f-128342cb1eb1\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.012027 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.012040 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sff58\" (UniqueName: \"kubernetes.io/projected/2a17ee11-c21e-4edd-b1cf-8ba27bae166c-kube-api-access-sff58\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.015277 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.019649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64" (OuterVolumeSpecName: "kube-api-access-p4q64") pod "6d6b6ec0-3dec-4d20-919f-128342cb1eb1" (UID: "6d6b6ec0-3dec-4d20-919f-128342cb1eb1"). InnerVolumeSpecName "kube-api-access-p4q64". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.024959 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.097467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config" (OuterVolumeSpecName: "config") pod "6d6b6ec0-3dec-4d20-919f-128342cb1eb1" (UID: "6d6b6ec0-3dec-4d20-919f-128342cb1eb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.098714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d6b6ec0-3dec-4d20-919f-128342cb1eb1" (UID: "6d6b6ec0-3dec-4d20-919f-128342cb1eb1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.104062 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.104974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d6b6ec0-3dec-4d20-919f-128342cb1eb1" (UID: "6d6b6ec0-3dec-4d20-919f-128342cb1eb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.108577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d6b6ec0-3dec-4d20-919f-128342cb1eb1" (UID: "6d6b6ec0-3dec-4d20-919f-128342cb1eb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.109523 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts\") pod \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m846t\" (UniqueName: \"kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t\") pod \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\" (UID: \"c9b1aed7-5e53-49ac-917c-adfacd5013fb\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts\") pod \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk8zd\" (UniqueName: \"kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd\") pod \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7dpw\" (UniqueName: \"kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw\") pod \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\" (UID: \"e0133422-f6fd-44ca-b8d0-d9e0696aad80\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts\") pod \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\" (UID: \"cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113909 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113928 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113938 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4q64\" (UniqueName: \"kubernetes.io/projected/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-kube-api-access-p4q64\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.113956 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d6b6ec0-3dec-4d20-919f-128342cb1eb1-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.118761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9b1aed7-5e53-49ac-917c-adfacd5013fb" (UID: "c9b1aed7-5e53-49ac-917c-adfacd5013fb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.119572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" (UID: "cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.119986 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0133422-f6fd-44ca-b8d0-d9e0696aad80" (UID: "e0133422-f6fd-44ca-b8d0-d9e0696aad80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.122735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw" (OuterVolumeSpecName: "kube-api-access-k7dpw") pod "e0133422-f6fd-44ca-b8d0-d9e0696aad80" (UID: "e0133422-f6fd-44ca-b8d0-d9e0696aad80"). InnerVolumeSpecName "kube-api-access-k7dpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.122866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd" (OuterVolumeSpecName: "kube-api-access-rk8zd") pod "cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" (UID: "cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a"). InnerVolumeSpecName "kube-api-access-rk8zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.134508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t" (OuterVolumeSpecName: "kube-api-access-m846t") pod "c9b1aed7-5e53-49ac-917c-adfacd5013fb" (UID: "c9b1aed7-5e53-49ac-917c-adfacd5013fb"). InnerVolumeSpecName "kube-api-access-m846t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.214935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts\") pod \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts\") pod \"91b13a5c-d047-40da-8a7f-87debe2da732\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vnnc\" (UniqueName: \"kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc\") pod \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\" (UID: \"bfe7c0f3-6d3d-4117-af54-c7c79fd09d47\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkvs6\" (UniqueName: \"kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6\") pod \"91b13a5c-d047-40da-8a7f-87debe2da732\" (UID: \"91b13a5c-d047-40da-8a7f-87debe2da732\") " Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215486 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" (UID: "bfe7c0f3-6d3d-4117-af54-c7c79fd09d47"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215893 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215916 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215929 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1aed7-5e53-49ac-917c-adfacd5013fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215939 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m846t\" (UniqueName: \"kubernetes.io/projected/c9b1aed7-5e53-49ac-917c-adfacd5013fb-kube-api-access-m846t\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215953 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0133422-f6fd-44ca-b8d0-d9e0696aad80-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215943 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91b13a5c-d047-40da-8a7f-87debe2da732" (UID: "91b13a5c-d047-40da-8a7f-87debe2da732"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.215962 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk8zd\" (UniqueName: \"kubernetes.io/projected/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a-kube-api-access-rk8zd\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.216099 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7dpw\" (UniqueName: \"kubernetes.io/projected/e0133422-f6fd-44ca-b8d0-d9e0696aad80-kube-api-access-k7dpw\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.218425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc" (OuterVolumeSpecName: "kube-api-access-7vnnc") pod "bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" (UID: "bfe7c0f3-6d3d-4117-af54-c7c79fd09d47"). InnerVolumeSpecName "kube-api-access-7vnnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.220550 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6" (OuterVolumeSpecName: "kube-api-access-vkvs6") pod "91b13a5c-d047-40da-8a7f-87debe2da732" (UID: "91b13a5c-d047-40da-8a7f-87debe2da732"). InnerVolumeSpecName "kube-api-access-vkvs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.317709 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91b13a5c-d047-40da-8a7f-87debe2da732-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.317756 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vnnc\" (UniqueName: \"kubernetes.io/projected/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47-kube-api-access-7vnnc\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.317773 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkvs6\" (UniqueName: \"kubernetes.io/projected/91b13a5c-d047-40da-8a7f-87debe2da732-kube-api-access-vkvs6\") on node \"crc\" DevicePath \"\"" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.827947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:31:58 crc kubenswrapper[4749]: E1129 01:31:58.828185 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 29 01:31:58 crc kubenswrapper[4749]: E1129 01:31:58.828241 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 29 01:31:58 crc kubenswrapper[4749]: E1129 01:31:58.828323 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift podName:9faf9cff-1e0a-4d87-b75e-8899450678a4 nodeName:}" failed. No retries permitted until 2025-11-29 01:32:14.828295335 +0000 UTC m=+1278.000445222 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift") pod "swift-storage-0" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4") : configmap "swift-ring-files" not found Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.974672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jzclg" event={"ID":"ef3ffdde-1c21-47d4-9a07-a84008344e08","Type":"ContainerStarted","Data":"35029c3554395ea9426e60b18014966674f056157d89d9bbcc57908c5a3991cc"} Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.974711 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.974698 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7db1-account-create-update-r9wsv" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.974746 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xrwj6" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.974787 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6a3a-account-create-update-h42h7" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.975016 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-jm2kl" Nov 29 01:31:58 crc kubenswrapper[4749]: I1129 01:31:58.975101 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q6wqg" Nov 29 01:31:59 crc kubenswrapper[4749]: I1129 01:31:59.017165 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jzclg" podStartSLOduration=2.916352683 podStartE2EDuration="13.017139982s" podCreationTimestamp="2025-11-29 01:31:46 +0000 UTC" firstStartedPulling="2025-11-29 01:31:47.558594764 +0000 UTC m=+1250.730744631" lastFinishedPulling="2025-11-29 01:31:57.659382083 +0000 UTC m=+1260.831531930" observedRunningTime="2025-11-29 01:31:59.011908023 +0000 UTC m=+1262.184057920" watchObservedRunningTime="2025-11-29 01:31:59.017139982 +0000 UTC m=+1262.189289879" Nov 29 01:31:59 crc kubenswrapper[4749]: I1129 01:31:59.056394 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:31:59 crc kubenswrapper[4749]: I1129 01:31:59.104016 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-hgc22"] Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.107917 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mz2m2"] Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108756 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="init" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108773 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="init" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108784 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108793 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108813 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b1aed7-5e53-49ac-917c-adfacd5013fb" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108822 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b1aed7-5e53-49ac-917c-adfacd5013fb" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108837 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a17ee11-c21e-4edd-b1cf-8ba27bae166c" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a17ee11-c21e-4edd-b1cf-8ba27bae166c" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108859 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="dnsmasq-dns" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108866 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="dnsmasq-dns" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b13a5c-d047-40da-8a7f-87debe2da732" 
containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108901 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b13a5c-d047-40da-8a7f-87debe2da732" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108931 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: E1129 01:32:00.108946 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0133422-f6fd-44ca-b8d0-d9e0696aad80" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.108956 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0133422-f6fd-44ca-b8d0-d9e0696aad80" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109228 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a17ee11-c21e-4edd-b1cf-8ba27bae166c" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109275 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b13a5c-d047-40da-8a7f-87debe2da732" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109307 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="dnsmasq-dns" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109333 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109358 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109387 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0133422-f6fd-44ca-b8d0-d9e0696aad80" containerName="mariadb-account-create-update" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.109423 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b1aed7-5e53-49ac-917c-adfacd5013fb" containerName="mariadb-database-create" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.110080 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.113766 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-brkw4" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.113880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.124794 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mz2m2"] Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.161602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.162640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.162740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.162908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqx8p\" (UniqueName: \"kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.265258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.265425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.265475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.265543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqx8p\" (UniqueName: \"kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p\") pod 
\"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.270337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.270996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.272133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.294510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqx8p\" (UniqueName: \"kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p\") pod \"glance-db-sync-mz2m2\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:00 crc kubenswrapper[4749]: I1129 01:32:00.433605 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:01 crc kubenswrapper[4749]: I1129 01:32:01.065951 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mz2m2"] Nov 29 01:32:01 crc kubenswrapper[4749]: I1129 01:32:01.092905 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" path="/var/lib/kubelet/pods/6d6b6ec0-3dec-4d20-919f-128342cb1eb1/volumes" Nov 29 01:32:01 crc kubenswrapper[4749]: I1129 01:32:01.688840 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-hgc22" podUID="6d6b6ec0-3dec-4d20-919f-128342cb1eb1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Nov 29 01:32:02 crc kubenswrapper[4749]: I1129 01:32:02.002191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mz2m2" event={"ID":"a76ca241-80b1-4019-b42a-12ecc908016c","Type":"ContainerStarted","Data":"c6e8cb192ab63c3d889849caf339d856ab7022feedd7c7f323a92714994dc1c3"} Nov 29 01:32:04 crc kubenswrapper[4749]: I1129 01:32:04.922251 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9lxvg" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller" probeResult="failure" output=< Nov 29 01:32:04 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 01:32:04 crc kubenswrapper[4749]: > Nov 29 01:32:05 crc kubenswrapper[4749]: I1129 01:32:05.029971 4749 generic.go:334] "Generic (PLEG): container finished" podID="ef3ffdde-1c21-47d4-9a07-a84008344e08" containerID="35029c3554395ea9426e60b18014966674f056157d89d9bbcc57908c5a3991cc" exitCode=0 Nov 29 01:32:05 crc kubenswrapper[4749]: I1129 01:32:05.030038 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jzclg" event={"ID":"ef3ffdde-1c21-47d4-9a07-a84008344e08","Type":"ContainerDied","Data":"35029c3554395ea9426e60b18014966674f056157d89d9bbcc57908c5a3991cc"} Nov 29 01:32:09 crc kubenswrapper[4749]: I1129 01:32:09.935449 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9lxvg" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller" probeResult="failure" output=< Nov 29 01:32:09 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 01:32:09 crc kubenswrapper[4749]: > Nov 29 01:32:09 crc kubenswrapper[4749]: I1129 01:32:09.935626 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:32:09 crc kubenswrapper[4749]: I1129 01:32:09.936093 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.182916 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9lxvg-config-g7cww"] Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.184445 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.187220 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.190917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg-config-g7cww"] Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.279926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.280006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.280090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7b8\" (UniqueName: \"kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.280148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.280493 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.280601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.383861 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7b8\" (UniqueName: \"kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.384366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.384392 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.384453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.384891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.386415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.404508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7b8\" (UniqueName: \"kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8\") pod \"ovn-controller-9lxvg-config-g7cww\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:10 crc kubenswrapper[4749]: I1129 01:32:10.526754 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.103788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jzclg" event={"ID":"ef3ffdde-1c21-47d4-9a07-a84008344e08","Type":"ContainerDied","Data":"30db410dc0d07d7207be4d33b2a3c84ea14af7a16a81d151231c9fb873bdd020"} Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.104109 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30db410dc0d07d7207be4d33b2a3c84ea14af7a16a81d151231c9fb873bdd020" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.108288 4749 generic.go:334] "Generic (PLEG): container finished" podID="31a44203-fd94-4eb4-952f-d54a5c577095" containerID="b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e" exitCode=0 Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.108351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerDied","Data":"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e"} Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.287923 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426251 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.426480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh8tq\" (UniqueName: \"kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq\") pod \"ef3ffdde-1c21-47d4-9a07-a84008344e08\" (UID: \"ef3ffdde-1c21-47d4-9a07-a84008344e08\") " Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.428799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.429272 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.434536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq" (OuterVolumeSpecName: "kube-api-access-mh8tq") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "kube-api-access-mh8tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.437752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.449278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts" (OuterVolumeSpecName: "scripts") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.461862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.463911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ef3ffdde-1c21-47d4-9a07-a84008344e08" (UID: "ef3ffdde-1c21-47d4-9a07-a84008344e08"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528336 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528380 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528396 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528409 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef3ffdde-1c21-47d4-9a07-a84008344e08-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528421 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ef3ffdde-1c21-47d4-9a07-a84008344e08-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528433 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ef3ffdde-1c21-47d4-9a07-a84008344e08-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.528445 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh8tq\" (UniqueName: \"kubernetes.io/projected/ef3ffdde-1c21-47d4-9a07-a84008344e08-kube-api-access-mh8tq\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:12 crc kubenswrapper[4749]: I1129 01:32:12.604802 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg-config-g7cww"] Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.116809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-g7cww" event={"ID":"6823cc6a-6938-4a99-96df-14b90dfd29d7","Type":"ContainerStarted","Data":"8bde49f653c7a890a3f11428b5610e77b4cf7e9835e8547b818296896403a5aa"} Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.117290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-g7cww" event={"ID":"6823cc6a-6938-4a99-96df-14b90dfd29d7","Type":"ContainerStarted","Data":"d7a00906422a02f94dbd4280fcdf9c134645486e1050c907d7986b79fc8171cf"} Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.119720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mz2m2" event={"ID":"a76ca241-80b1-4019-b42a-12ecc908016c","Type":"ContainerStarted","Data":"9f887fa6c5c95bfb17f5897d64af04248c64f3261379834f006b8fcb67264158"} Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.121413 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jzclg" Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.128469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerStarted","Data":"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1"} Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.128729 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.146290 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9lxvg-config-g7cww" podStartSLOduration=3.146273246 podStartE2EDuration="3.146273246s" podCreationTimestamp="2025-11-29 01:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:13.142650276 +0000 UTC m=+1276.314800143" watchObservedRunningTime="2025-11-29 01:32:13.146273246 +0000 UTC m=+1276.318423103" Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.179027 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.323091812 podStartE2EDuration="1m9.179009757s" podCreationTimestamp="2025-11-29 01:31:04 +0000 UTC" firstStartedPulling="2025-11-29 01:31:06.861023178 +0000 UTC m=+1210.033173035" lastFinishedPulling="2025-11-29 01:31:37.716941083 +0000 UTC m=+1240.889090980" observedRunningTime="2025-11-29 01:32:13.176733341 +0000 UTC m=+1276.348883208" watchObservedRunningTime="2025-11-29 01:32:13.179009757 +0000 UTC m=+1276.351159614" Nov 29 01:32:13 crc kubenswrapper[4749]: I1129 01:32:13.197387 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mz2m2" podStartSLOduration=2.095265263 podStartE2EDuration="13.197365442s" podCreationTimestamp="2025-11-29 01:32:00 +0000 UTC" firstStartedPulling="2025-11-29 01:32:01.082920678 +0000 UTC m=+1264.255070545" lastFinishedPulling="2025-11-29 01:32:12.185020847 +0000 UTC m=+1275.357170724" observedRunningTime="2025-11-29 01:32:13.192366408 +0000 UTC m=+1276.364516255" watchObservedRunningTime="2025-11-29 01:32:13.197365442 +0000 UTC m=+1276.369515309" Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.131104 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a9603fe-72d8-479a-86be-9b914455fba1" containerID="d15407c83f3311f8c3958f6e4bd3c9da53a4bea36613017566ab26cfa4a60437" exitCode=0 Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.131232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerDied","Data":"d15407c83f3311f8c3958f6e4bd3c9da53a4bea36613017566ab26cfa4a60437"} Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.133900 4749 generic.go:334] "Generic (PLEG): container finished" podID="6823cc6a-6938-4a99-96df-14b90dfd29d7" containerID="8bde49f653c7a890a3f11428b5610e77b4cf7e9835e8547b818296896403a5aa" exitCode=0 Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.133987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-g7cww" event={"ID":"6823cc6a-6938-4a99-96df-14b90dfd29d7","Type":"ContainerDied","Data":"8bde49f653c7a890a3f11428b5610e77b4cf7e9835e8547b818296896403a5aa"} Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.868781 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.892372 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"swift-storage-0\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " pod="openstack/swift-storage-0" Nov 29 01:32:14 crc kubenswrapper[4749]: I1129 01:32:14.918718 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9lxvg" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.056966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.180568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerStarted","Data":"00208c9ed91f795bee62848698e8c46182c049d63a9adf626a2bc73dc90a56e8"} Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.182347 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.218363 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371966.63643 podStartE2EDuration="1m10.218344637s" podCreationTimestamp="2025-11-29 01:31:05 +0000 UTC" firstStartedPulling="2025-11-29 01:31:07.148023767 +0000 UTC m=+1210.320173624" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:15.214484632 +0000 UTC m=+1278.386634499" watchObservedRunningTime="2025-11-29 01:32:15.218344637 +0000 UTC m=+1278.390494494" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.549242 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.585697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.585756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.585804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.585912 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.586000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.586018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7b8\" (UniqueName: \"kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8\") pod \"6823cc6a-6938-4a99-96df-14b90dfd29d7\" (UID: \"6823cc6a-6938-4a99-96df-14b90dfd29d7\") " Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.587040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.587095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.587118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run" (OuterVolumeSpecName: "var-run") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.587696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.588388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts" (OuterVolumeSpecName: "scripts") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.606564 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8" (OuterVolumeSpecName: "kube-api-access-bn7b8") pod "6823cc6a-6938-4a99-96df-14b90dfd29d7" (UID: "6823cc6a-6938-4a99-96df-14b90dfd29d7"). InnerVolumeSpecName "kube-api-access-bn7b8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687282 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687322 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6823cc6a-6938-4a99-96df-14b90dfd29d7-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687331 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7b8\" (UniqueName: \"kubernetes.io/projected/6823cc6a-6938-4a99-96df-14b90dfd29d7-kube-api-access-bn7b8\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687340 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687348 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.687356 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6823cc6a-6938-4a99-96df-14b90dfd29d7-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.698981 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 29 01:32:15 crc kubenswrapper[4749]: W1129 01:32:15.700084 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9faf9cff_1e0a_4d87_b75e_8899450678a4.slice/crio-b4e612f85ce7e3f4849419fc2aed4f1ce08e272769ae1279cbbea6357590e912 WatchSource:0}: Error finding container b4e612f85ce7e3f4849419fc2aed4f1ce08e272769ae1279cbbea6357590e912: 
Status 404 returned error can't find the container with id b4e612f85ce7e3f4849419fc2aed4f1ce08e272769ae1279cbbea6357590e912 Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.717021 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9lxvg-config-g7cww"] Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.723861 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9lxvg-config-g7cww"] Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.851799 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9lxvg-config-hfbwh"] Nov 29 01:32:15 crc kubenswrapper[4749]: E1129 01:32:15.852130 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6823cc6a-6938-4a99-96df-14b90dfd29d7" containerName="ovn-config" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.852142 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6823cc6a-6938-4a99-96df-14b90dfd29d7" containerName="ovn-config" Nov 29 01:32:15 crc kubenswrapper[4749]: E1129 01:32:15.852165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef3ffdde-1c21-47d4-9a07-a84008344e08" containerName="swift-ring-rebalance" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.852171 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef3ffdde-1c21-47d4-9a07-a84008344e08" containerName="swift-ring-rebalance" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.852351 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef3ffdde-1c21-47d4-9a07-a84008344e08" containerName="swift-ring-rebalance" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.852375 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6823cc6a-6938-4a99-96df-14b90dfd29d7" containerName="ovn-config" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.853103 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.872363 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg-config-hfbwh"] Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889796 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d25gb\" (UniqueName: \"kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.889914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d25gb\" (UniqueName: \"kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc 
kubenswrapper[4749]: I1129 01:32:15.991368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991406 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.991950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.992094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.992589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:15 crc kubenswrapper[4749]: I1129 01:32:15.993501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.011878 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d25gb\" (UniqueName: \"kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb\") pod \"ovn-controller-9lxvg-config-hfbwh\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.173100 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.201376 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-g7cww" Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.201404 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7a00906422a02f94dbd4280fcdf9c134645486e1050c907d7986b79fc8171cf" Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.204019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"b4e612f85ce7e3f4849419fc2aed4f1ce08e272769ae1279cbbea6357590e912"} Nov 29 01:32:16 crc kubenswrapper[4749]: I1129 01:32:16.468388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9lxvg-config-hfbwh"] Nov 29 01:32:16 crc kubenswrapper[4749]: W1129 01:32:16.469475 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf496338b_ca41_4f60_9411_8d7b8b6672c6.slice/crio-2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66 WatchSource:0}: Error finding container 2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66: Status 404 returned error can't find the container with id 2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66 Nov 29 01:32:17 crc kubenswrapper[4749]: I1129 01:32:17.090534 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6823cc6a-6938-4a99-96df-14b90dfd29d7" path="/var/lib/kubelet/pods/6823cc6a-6938-4a99-96df-14b90dfd29d7/volumes" Nov 29 01:32:17 crc kubenswrapper[4749]: I1129 01:32:17.214284 4749 generic.go:334] "Generic (PLEG): container finished" podID="f496338b-ca41-4f60-9411-8d7b8b6672c6" containerID="99d208eb4c50e726cc807bb0e460337bfaea3b2f788b14b09f5f549cd368568d" exitCode=0 Nov 29 01:32:17 crc kubenswrapper[4749]: I1129 01:32:17.214333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-hfbwh" event={"ID":"f496338b-ca41-4f60-9411-8d7b8b6672c6","Type":"ContainerDied","Data":"99d208eb4c50e726cc807bb0e460337bfaea3b2f788b14b09f5f549cd368568d"} Nov 29 01:32:17 crc kubenswrapper[4749]: I1129 01:32:17.214363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-hfbwh" event={"ID":"f496338b-ca41-4f60-9411-8d7b8b6672c6","Type":"ContainerStarted","Data":"2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66"} Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.225405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"c6544b18ba1c97f83dddee9f06c12698f0b180173fc59826441ce3ebe0d76ccb"} Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.225801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"f9bfeddd31df51ff31ba3def5c4f3f2e8ba2fc1efc9f023e76e32fba61e40263"} Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.225813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"e28b38f1e7521875f74819f009a7406efa1e468522d52a3aae45ded543cc2908"} Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.225822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"019a47cff718db2d0002a6650c9afb4a966ae40b1aef5de5b8fac16e7100d973"} Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.563865 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d25gb\" (UniqueName: \"kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn\") pod \"f496338b-ca41-4f60-9411-8d7b8b6672c6\" (UID: \"f496338b-ca41-4f60-9411-8d7b8b6672c6\") " Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655927 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run" (OuterVolumeSpecName: "var-run") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.655990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.656289 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.656382 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.656402 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.656788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.657468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts" (OuterVolumeSpecName: "scripts") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.663395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb" (OuterVolumeSpecName: "kube-api-access-d25gb") pod "f496338b-ca41-4f60-9411-8d7b8b6672c6" (UID: "f496338b-ca41-4f60-9411-8d7b8b6672c6"). InnerVolumeSpecName "kube-api-access-d25gb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.757961 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d25gb\" (UniqueName: \"kubernetes.io/projected/f496338b-ca41-4f60-9411-8d7b8b6672c6-kube-api-access-d25gb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.758001 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f496338b-ca41-4f60-9411-8d7b8b6672c6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.758015 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:18 crc kubenswrapper[4749]: I1129 01:32:18.758027 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f496338b-ca41-4f60-9411-8d7b8b6672c6-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:19 crc kubenswrapper[4749]: I1129 01:32:19.236431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg-config-hfbwh" event={"ID":"f496338b-ca41-4f60-9411-8d7b8b6672c6","Type":"ContainerDied","Data":"2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66"} Nov 29 01:32:19 crc kubenswrapper[4749]: I1129 01:32:19.236839 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd7f5b3c6933497996ada33ecdeb75e1165f31447e61a0d6d3def8380c65f66" Nov 29 01:32:19 crc kubenswrapper[4749]: I1129 01:32:19.236496 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9lxvg-config-hfbwh" Nov 29 01:32:19 crc kubenswrapper[4749]: I1129 01:32:19.658580 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9lxvg-config-hfbwh"] Nov 29 01:32:19 crc kubenswrapper[4749]: I1129 01:32:19.666634 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9lxvg-config-hfbwh"] Nov 29 01:32:21 crc kubenswrapper[4749]: I1129 01:32:21.088286 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f496338b-ca41-4f60-9411-8d7b8b6672c6" path="/var/lib/kubelet/pods/f496338b-ca41-4f60-9411-8d7b8b6672c6/volumes" Nov 29 01:32:22 crc kubenswrapper[4749]: I1129 01:32:22.269952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"5cd4b93f19f43d8c22124c663685c2a5b7c3218c0e81bb3c102046e182dec0d8"} Nov 29 01:32:22 crc kubenswrapper[4749]: I1129 01:32:22.270179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"e391c22d2e560a1daaa903407899dba8b65a96672ec1d83469d63ccacf47014a"} Nov 29 01:32:24 crc kubenswrapper[4749]: I1129 01:32:24.297251 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"c135e668a9588d4d539c1f9ecce850efa84d0c9cf2ace29a444e0f2d6c4d4e1d"} Nov 29 01:32:25 crc kubenswrapper[4749]: I1129 01:32:25.309636 4749 generic.go:334] "Generic (PLEG): container finished" podID="a76ca241-80b1-4019-b42a-12ecc908016c" containerID="9f887fa6c5c95bfb17f5897d64af04248c64f3261379834f006b8fcb67264158" exitCode=0 Nov 29 01:32:25 crc kubenswrapper[4749]: I1129 01:32:25.309762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mz2m2" event={"ID":"a76ca241-80b1-4019-b42a-12ecc908016c","Type":"ContainerDied","Data":"9f887fa6c5c95bfb17f5897d64af04248c64f3261379834f006b8fcb67264158"} Nov 29 01:32:25 crc kubenswrapper[4749]: I1129 01:32:25.320658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"9f220385e1bef364c02962b269f0e2db7e6230aaf50d37f52f736cd72a640af1"} Nov 29 01:32:25 crc kubenswrapper[4749]: I1129 01:32:25.374112 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:32:25 crc kubenswrapper[4749]: I1129 01:32:25.374247 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.262455 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.336302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"e9c55bc5cd269128e765337cd53c4eb2aa665fc0070c9dbb2cddb9feb42c9d56"} Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.336340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"2153b28d7b1ae703a650e64e126179253f7846999b9a6400cfd009a599bbb246"} Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.336350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"bfafc82b272a3145020f82bc16f80a2708db4958f657da2553e53da41914e8a6"} Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.575464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.600833 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-74cnm"] Nov 29 01:32:26 crc kubenswrapper[4749]: E1129 01:32:26.605406 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f496338b-ca41-4f60-9411-8d7b8b6672c6" containerName="ovn-config" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.605432 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f496338b-ca41-4f60-9411-8d7b8b6672c6" containerName="ovn-config" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.605703 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f496338b-ca41-4f60-9411-8d7b8b6672c6" containerName="ovn-config" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.606454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.675103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-74cnm"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.713774 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jq4xn"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.719158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.741360 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jq4xn"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.747531 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e578-account-create-update-jwhdh"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.753550 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.760653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.788476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.788563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6wb\" (UniqueName: \"kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.881311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e578-account-create-update-jwhdh"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6wb\" (UniqueName: \"kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bk6f\" (UniqueName: \"kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f\") pod \"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kplzq\" (UniqueName: \"kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.891393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts\") pod 
\"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.892443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.933091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6wb\" (UniqueName: \"kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb\") pod \"cinder-db-create-74cnm\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.977257 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7ea4-account-create-update-x94pt"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.978392 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.983943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7ea4-account-create-update-x94pt"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.997995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.998269 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8kwj6"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.999189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8kwj6"] Nov 29 01:32:26 crc kubenswrapper[4749]: I1129 01:32:26.999280 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.000285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bk6f\" (UniqueName: \"kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f\") pod \"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.000330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kplzq\" (UniqueName: \"kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.000355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.000392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts\") pod \"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.001136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts\") pod \"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.002128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.049835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bk6f\" (UniqueName: \"kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f\") pod \"cinder-e578-account-create-update-jwhdh\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.060246 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5123-account-create-update-2hbcf"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.061922 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.065619 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.074776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kplzq\" (UniqueName: \"kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq\") pod \"barbican-db-create-jq4xn\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.101535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.101582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.101619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gt2\" (UniqueName: \"kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.101649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfkt\" (UniqueName: \"kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.121812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vzbzm"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.122818 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.127483 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.127687 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.127857 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5mw78" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.127959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.133285 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5123-account-create-update-2hbcf"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.144974 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.170961 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vzbzm"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.171647 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.206833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfkt\" (UniqueName: \"kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.206927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npgbx\" (UniqueName: \"kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.206952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.206975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.207003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: 
I1129 01:32:27.207040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82gt2\" (UniqueName: \"kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.208148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.208624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.226543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.237816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfkt\" (UniqueName: \"kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt\") pod \"neutron-db-create-8kwj6\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.264491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82gt2\" (UniqueName: \"kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2\") pod \"barbican-7ea4-account-create-update-x94pt\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.307696 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data\") pod \"a76ca241-80b1-4019-b42a-12ecc908016c\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.307850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle\") pod \"a76ca241-80b1-4019-b42a-12ecc908016c\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.307909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqx8p\" (UniqueName: \"kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p\") pod \"a76ca241-80b1-4019-b42a-12ecc908016c\" (UID: \"a76ca241-80b1-4019-b42a-12ecc908016c\") " Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.307992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data\") pod \"a76ca241-80b1-4019-b42a-12ecc908016c\" (UID: 
\"a76ca241-80b1-4019-b42a-12ecc908016c\") " Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.308162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.308196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.308323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npgbx\" (UniqueName: \"kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.308344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.308366 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9pw\" (UniqueName: \"kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.312288 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.324396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.352412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p" (OuterVolumeSpecName: "kube-api-access-qqx8p") pod "a76ca241-80b1-4019-b42a-12ecc908016c" (UID: "a76ca241-80b1-4019-b42a-12ecc908016c"). InnerVolumeSpecName "kube-api-access-qqx8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.359458 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a76ca241-80b1-4019-b42a-12ecc908016c" (UID: "a76ca241-80b1-4019-b42a-12ecc908016c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.360595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npgbx\" (UniqueName: \"kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx\") pod \"neutron-5123-account-create-update-2hbcf\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.363312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a76ca241-80b1-4019-b42a-12ecc908016c" (UID: "a76ca241-80b1-4019-b42a-12ecc908016c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.375618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.393513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"280811dcf44a523f325934a8c2570ea8ae0d344113018439d508fa6efc5324ec"} Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.393561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"a78d3a0a66909398264af9304c9055f1ddb5c500bf5846177971f56a8ebc2113"} Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.399396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data" (OuterVolumeSpecName: "config-data") pod "a76ca241-80b1-4019-b42a-12ecc908016c" (UID: "a76ca241-80b1-4019-b42a-12ecc908016c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.406426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mz2m2" event={"ID":"a76ca241-80b1-4019-b42a-12ecc908016c","Type":"ContainerDied","Data":"c6e8cb192ab63c3d889849caf339d856ab7022feedd7c7f323a92714994dc1c3"} Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.406458 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6e8cb192ab63c3d889849caf339d856ab7022feedd7c7f323a92714994dc1c3" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.406518 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mz2m2" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.410790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.410840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.410958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9pw\" (UniqueName: \"kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.411005 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.411015 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqx8p\" (UniqueName: \"kubernetes.io/projected/a76ca241-80b1-4019-b42a-12ecc908016c-kube-api-access-qqx8p\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.411026 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.411034 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a76ca241-80b1-4019-b42a-12ecc908016c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.439529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.442801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.445191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9pw\" (UniqueName: \"kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw\") pod \"keystone-db-sync-vzbzm\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.459295 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.528661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.553438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.840550 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e578-account-create-update-jwhdh"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.865141 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:27 crc kubenswrapper[4749]: E1129 01:32:27.865596 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a76ca241-80b1-4019-b42a-12ecc908016c" containerName="glance-db-sync" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.865609 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a76ca241-80b1-4019-b42a-12ecc908016c" containerName="glance-db-sync" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.865788 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a76ca241-80b1-4019-b42a-12ecc908016c" containerName="glance-db-sync" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.866639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.882639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:27 crc kubenswrapper[4749]: I1129 01:32:27.956137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-74cnm"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.032980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.033068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.033096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.033147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.033192 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfhp\" (UniqueName: \"kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.143902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfhp\" (UniqueName: \"kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.144104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.144230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.144261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.144350 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.145858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.146773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.147358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.162652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.186512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfhp\" (UniqueName: \"kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp\") pod \"dnsmasq-dns-5b946c75cc-cpm98\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.286664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jq4xn"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.301666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7ea4-account-create-update-x94pt"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.325245 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.414400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jq4xn" event={"ID":"fb744c9f-38bf-4a4a-8725-c917921e58c7","Type":"ContainerStarted","Data":"5f17c14b36fe93f49969d2ef6c54ad65e1419bbdf618e54e648afcfc209d89fb"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.415624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e578-account-create-update-jwhdh" event={"ID":"079e1231-eb4d-4e9b-b265-f1fd17be981c","Type":"ContainerStarted","Data":"1c3f1dfce598ef9b86d1af6c3f250556b2470fb0d92e58c0d661d554b6eca798"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.415649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e578-account-create-update-jwhdh" event={"ID":"079e1231-eb4d-4e9b-b265-f1fd17be981c","Type":"ContainerStarted","Data":"5dddec1fa0a21948ab559f65b7ab2531a581df654613241991f7c7de4916f9ff"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.420833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7ea4-account-create-update-x94pt" event={"ID":"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494","Type":"ContainerStarted","Data":"a733604766ecf181bd8cba3b13e3e77e8daa6bb85866f0200f558061da6bd758"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.421812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-74cnm" event={"ID":"3c744d6e-38ea-451d-abe1-03208a580698","Type":"ContainerStarted","Data":"f7d18dd171d151bb5f7f072cb26eda7543c6d923103ab1c8ac9ee90af97d0e8b"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.421830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-74cnm" event={"ID":"3c744d6e-38ea-451d-abe1-03208a580698","Type":"ContainerStarted","Data":"d2b42527bf226455a28bd63ef4e269164ea0e0485b81632ccdf0abd0198713be"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.440520 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-e578-account-create-update-jwhdh" podStartSLOduration=2.440500728 podStartE2EDuration="2.440500728s" podCreationTimestamp="2025-11-29 01:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:28.438113179 +0000 UTC m=+1291.610263046" 
watchObservedRunningTime="2025-11-29 01:32:28.440500728 +0000 UTC m=+1291.612650585" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.446241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"894dac0e55c5b8303d062dc9a73ff359863207502ac50a567d5a516b5044255f"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.446274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerStarted","Data":"efa94720f5ca79ab7d9121540b501d8e5d9310c95e8baf7e219e6f4cee72dadd"} Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.492746 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-74cnm" podStartSLOduration=2.492727172 podStartE2EDuration="2.492727172s" podCreationTimestamp="2025-11-29 01:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:28.459011417 +0000 UTC m=+1291.631161284" watchObservedRunningTime="2025-11-29 01:32:28.492727172 +0000 UTC m=+1291.664877029" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.523287 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.681605177 podStartE2EDuration="47.523270408s" podCreationTimestamp="2025-11-29 01:31:41 +0000 UTC" firstStartedPulling="2025-11-29 01:32:15.716556657 +0000 UTC m=+1278.888706514" lastFinishedPulling="2025-11-29 01:32:25.558221888 +0000 UTC m=+1288.730371745" observedRunningTime="2025-11-29 01:32:28.49267332 +0000 UTC m=+1291.664823187" watchObservedRunningTime="2025-11-29 01:32:28.523270408 +0000 UTC m=+1291.695420255" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.531244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8kwj6"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.537420 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5123-account-create-update-2hbcf"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.667096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vzbzm"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.883866 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.917846 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.919611 4749 util.go:30] "No sandbox for pod can be found. 
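The two "Observed pod startup duration" entries above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally excludes the image-pull window (firstStartedPulling to lastFinishedPulling). That is why cinder-db-create-74cnm, which pulled nothing (zero-value pull timestamps), reports identical SLO and E2E figures while swift-storage-0 does not. A back-of-the-envelope check of the swift-storage-0 arithmetic in Python; the parse helper is hypothetical, written only to truncate the log's nanosecond timestamps to what datetime can parse:

    from datetime import datetime, timezone

    def parse(ts: str) -> datetime:
        # The kubelet prints nanoseconds; datetime parses at most microseconds, so truncate.
        date, rest = ts.split(".", 1)
        frac = rest.split(" ", 1)[0]
        return datetime.strptime(f"{date}.{frac[:6]}", "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

    created   = parse("2025-11-29 01:31:41.0 +0000 UTC")          # podCreationTimestamp
    pull_from = parse("2025-11-29 01:32:15.716556657 +0000 UTC")  # firstStartedPulling
    pull_to   = parse("2025-11-29 01:32:25.558221888 +0000 UTC")  # lastFinishedPulling
    observed  = parse("2025-11-29 01:32:28.523270408 +0000 UTC")  # watchObservedRunningTime

    e2e = (observed - created).total_seconds()
    slo = e2e - (pull_to - pull_from).total_seconds()
    print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")  # E2E=47.523270s SLO=37.681605s, matching the entry above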
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.922508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 29 01:32:28 crc kubenswrapper[4749]: I1129 01:32:28.933111 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.013512 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:29 crc kubenswrapper[4749]: W1129 01:32:29.051567 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59b5efa2_32c6_4726_b09c_e84f58fd4a70.slice/crio-d54b5bdd7d2b8990e4ace4c329f1632aa5bb1c166556ea935b4f27c26e60a575 WatchSource:0}: Error finding container d54b5bdd7d2b8990e4ace4c329f1632aa5bb1c166556ea935b4f27c26e60a575: Status 404 returned error can't find the container with id d54b5bdd7d2b8990e4ace4c329f1632aa5bb1c166556ea935b4f27c26e60a575 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.077991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rks\" (UniqueName: \"kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 
01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.179590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7rks\" (UniqueName: \"kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.179966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.180012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.180036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.180076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.180113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.180940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.181859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.182385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.183036 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.183975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.197628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7rks\" (UniqueName: \"kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks\") pod \"dnsmasq-dns-74f6bcbc87-6npfw\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.251460 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.462350 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vzbzm" event={"ID":"9bff88ad-b577-46b9-8bf4-4328b3684b6d","Type":"ContainerStarted","Data":"7dc790c9b29b34961491aa340de426c4bc6b3b112a9f3e005d64911bb9f9a51c"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.470112 4749 generic.go:334] "Generic (PLEG): container finished" podID="fb744c9f-38bf-4a4a-8725-c917921e58c7" containerID="a8c8a713e75709f52b35c805603adbd78c58b5ef7e39c57d1ccf8bbf1b1524a2" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.470185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jq4xn" event={"ID":"fb744c9f-38bf-4a4a-8725-c917921e58c7","Type":"ContainerDied","Data":"a8c8a713e75709f52b35c805603adbd78c58b5ef7e39c57d1ccf8bbf1b1524a2"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.474239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" event={"ID":"59b5efa2-32c6-4726-b09c-e84f58fd4a70","Type":"ContainerStarted","Data":"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.474271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" event={"ID":"59b5efa2-32c6-4726-b09c-e84f58fd4a70","Type":"ContainerStarted","Data":"d54b5bdd7d2b8990e4ace4c329f1632aa5bb1c166556ea935b4f27c26e60a575"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.474369 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" podUID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" containerName="init" containerID="cri-o://6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085" gracePeriod=10 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.480374 4749 generic.go:334] "Generic (PLEG): container finished" podID="079e1231-eb4d-4e9b-b265-f1fd17be981c" containerID="1c3f1dfce598ef9b86d1af6c3f250556b2470fb0d92e58c0d661d554b6eca798" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.480431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e578-account-create-update-jwhdh" 
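These kubenswrapper entries all share klog's structured form: a quoted message followed by key="value" pairs (pod, containerID, event payloads, and so on). When digging through a capture like this one, a throwaway parser is often handier than grep; the sketch below is a hypothetical helper targeting exactly this simple shape, not any official kubelet tooling. It copes with the \" escapes inside messages but makes no attempt at every klog subtlety:

    import json
    import re

    # key="quoted value" or key=bareword, as in the kubenswrapper lines above.
    PAIR = re.compile(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)')
    MSG = re.compile(r'\] "((?:[^"\\]|\\.)*)"')

    def parse_entry(line: str) -> dict:
        fields = {}
        rest = line
        m = MSG.search(line)
        if m:
            fields["msg"] = m.group(1)
            rest = line[m.end():]
        for key, value in PAIR.findall(rest):
            # json.loads strips the quotes and undoes escaping; barewords stay strings.
            fields[key] = json.loads(value) if value.startswith('"') else value
        return fields

    line = ('I1129 01:32:29.474369 4749 kuberuntime_container.go:808] '
            '"Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" gracePeriod=10')
    print(parse_entry(line))
    # {'msg': 'Killing container with a grace period', 'pod': 'openstack/dnsmasq-dns-5b946c75cc-cpm98', 'gracePeriod': '10'}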
event={"ID":"079e1231-eb4d-4e9b-b265-f1fd17be981c","Type":"ContainerDied","Data":"1c3f1dfce598ef9b86d1af6c3f250556b2470fb0d92e58c0d661d554b6eca798"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.482977 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" containerID="63710a4616bf5617142071918ea3fd7b7d08dccb9a14eb02d7c475c5731d8bc3" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.483036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7ea4-account-create-update-x94pt" event={"ID":"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494","Type":"ContainerDied","Data":"63710a4616bf5617142071918ea3fd7b7d08dccb9a14eb02d7c475c5731d8bc3"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.483943 4749 generic.go:334] "Generic (PLEG): container finished" podID="f53cda25-6fa5-4ec3-bf8f-686c8619e97f" containerID="f861eef455efe7c4d77d157775bb326fa3614266dfd5ecbfd8147c59729d017f" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.483984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kwj6" event={"ID":"f53cda25-6fa5-4ec3-bf8f-686c8619e97f","Type":"ContainerDied","Data":"f861eef455efe7c4d77d157775bb326fa3614266dfd5ecbfd8147c59729d017f"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.483999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kwj6" event={"ID":"f53cda25-6fa5-4ec3-bf8f-686c8619e97f","Type":"ContainerStarted","Data":"00a0bb5d99332571da508dcfbd6866055dadf71f26c16d0a356ed281c53e68dc"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.487862 4749 generic.go:334] "Generic (PLEG): container finished" podID="3c744d6e-38ea-451d-abe1-03208a580698" containerID="f7d18dd171d151bb5f7f072cb26eda7543c6d923103ab1c8ac9ee90af97d0e8b" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.487936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-74cnm" event={"ID":"3c744d6e-38ea-451d-abe1-03208a580698","Type":"ContainerDied","Data":"f7d18dd171d151bb5f7f072cb26eda7543c6d923103ab1c8ac9ee90af97d0e8b"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.496005 4749 generic.go:334] "Generic (PLEG): container finished" podID="ab60b96f-60f6-436a-be55-f1d0edc65b01" containerID="3e486a43c1b894bc08428cc6a8f5fcc03bc72bb1a25379d9061ae2fe460693e9" exitCode=0 Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.497489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5123-account-create-update-2hbcf" event={"ID":"ab60b96f-60f6-436a-be55-f1d0edc65b01","Type":"ContainerDied","Data":"3e486a43c1b894bc08428cc6a8f5fcc03bc72bb1a25379d9061ae2fe460693e9"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.497530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5123-account-create-update-2hbcf" event={"ID":"ab60b96f-60f6-436a-be55-f1d0edc65b01","Type":"ContainerStarted","Data":"3a37fcfe30a4441cdb25c6bcc97522e803741390dc8ac08f13da7e7193767ae3"} Nov 29 01:32:29 crc kubenswrapper[4749]: I1129 01:32:29.601491 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:32:29 crc kubenswrapper[4749]: W1129 01:32:29.616507 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b8e5b07_b7b6_4f46_81bf_5229500201ca.slice/crio-5e7ce0a4cd1e42ffb00de6514596ceb511d7cb9827f36fa6f47db3ea70e490e1 
WatchSource:0}: Error finding container 5e7ce0a4cd1e42ffb00de6514596ceb511d7cb9827f36fa6f47db3ea70e490e1: Status 404 returned error can't find the container with id 5e7ce0a4cd1e42ffb00de6514596ceb511d7cb9827f36fa6f47db3ea70e490e1 Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.041925 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.097060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvfhp\" (UniqueName: \"kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp\") pod \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.097101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb\") pod \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.097160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config\") pod \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.097302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb\") pod \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.097409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc\") pod \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\" (UID: \"59b5efa2-32c6-4726-b09c-e84f58fd4a70\") " Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.111375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp" (OuterVolumeSpecName: "kube-api-access-tvfhp") pod "59b5efa2-32c6-4726-b09c-e84f58fd4a70" (UID: "59b5efa2-32c6-4726-b09c-e84f58fd4a70"). InnerVolumeSpecName "kube-api-access-tvfhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.124074 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59b5efa2-32c6-4726-b09c-e84f58fd4a70" (UID: "59b5efa2-32c6-4726-b09c-e84f58fd4a70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.128744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59b5efa2-32c6-4726-b09c-e84f58fd4a70" (UID: "59b5efa2-32c6-4726-b09c-e84f58fd4a70"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.131507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59b5efa2-32c6-4726-b09c-e84f58fd4a70" (UID: "59b5efa2-32c6-4726-b09c-e84f58fd4a70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.153795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config" (OuterVolumeSpecName: "config") pod "59b5efa2-32c6-4726-b09c-e84f58fd4a70" (UID: "59b5efa2-32c6-4726-b09c-e84f58fd4a70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.199562 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.199595 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvfhp\" (UniqueName: \"kubernetes.io/projected/59b5efa2-32c6-4726-b09c-e84f58fd4a70-kube-api-access-tvfhp\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.199604 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.199613 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.199623 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59b5efa2-32c6-4726-b09c-e84f58fd4a70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.506151 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerID="da2b71583938cd818f6cb4c15d838cd882df14adc0c7852627aff51e6faaa49c" exitCode=0 Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.506259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" event={"ID":"2b8e5b07-b7b6-4f46-81bf-5229500201ca","Type":"ContainerDied","Data":"da2b71583938cd818f6cb4c15d838cd882df14adc0c7852627aff51e6faaa49c"} Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.506317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" event={"ID":"2b8e5b07-b7b6-4f46-81bf-5229500201ca","Type":"ContainerStarted","Data":"5e7ce0a4cd1e42ffb00de6514596ceb511d7cb9827f36fa6f47db3ea70e490e1"} Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.511138 4749 generic.go:334] "Generic (PLEG): container finished" podID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" containerID="6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085" exitCode=0 Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.511239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" 
event={"ID":"59b5efa2-32c6-4726-b09c-e84f58fd4a70","Type":"ContainerDied","Data":"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085"} Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.511295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" event={"ID":"59b5efa2-32c6-4726-b09c-e84f58fd4a70","Type":"ContainerDied","Data":"d54b5bdd7d2b8990e4ace4c329f1632aa5bb1c166556ea935b4f27c26e60a575"} Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.511316 4749 scope.go:117] "RemoveContainer" containerID="6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.511434 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-cpm98" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.590116 4749 scope.go:117] "RemoveContainer" containerID="6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.598417 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:30 crc kubenswrapper[4749]: E1129 01:32:30.602011 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085\": container with ID starting with 6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085 not found: ID does not exist" containerID="6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.602054 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085"} err="failed to get container status \"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085\": rpc error: code = NotFound desc = could not find container \"6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085\": container with ID starting with 6ef50aba57497a568cb792afbd8c4a3df832ba65eb2a4a016f39749cc96b9085 not found: ID does not exist" Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.625988 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-cpm98"] Nov 29 01:32:30 crc kubenswrapper[4749]: I1129 01:32:30.964090 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.017154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts\") pod \"ab60b96f-60f6-436a-be55-f1d0edc65b01\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.017295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npgbx\" (UniqueName: \"kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx\") pod \"ab60b96f-60f6-436a-be55-f1d0edc65b01\" (UID: \"ab60b96f-60f6-436a-be55-f1d0edc65b01\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.019227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ab60b96f-60f6-436a-be55-f1d0edc65b01" (UID: "ab60b96f-60f6-436a-be55-f1d0edc65b01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.040571 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx" (OuterVolumeSpecName: "kube-api-access-npgbx") pod "ab60b96f-60f6-436a-be55-f1d0edc65b01" (UID: "ab60b96f-60f6-436a-be55-f1d0edc65b01"). InnerVolumeSpecName "kube-api-access-npgbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.085580 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" path="/var/lib/kubelet/pods/59b5efa2-32c6-4726-b09c-e84f58fd4a70/volumes" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.118900 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab60b96f-60f6-436a-be55-f1d0edc65b01-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.118931 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npgbx\" (UniqueName: \"kubernetes.io/projected/ab60b96f-60f6-436a-be55-f1d0edc65b01-kube-api-access-npgbx\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.213231 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.230008 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.248037 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.248739 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.255954 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kplzq\" (UniqueName: \"kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq\") pod \"fb744c9f-38bf-4a4a-8725-c917921e58c7\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts\") pod \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts\") pod \"fb744c9f-38bf-4a4a-8725-c917921e58c7\" (UID: \"fb744c9f-38bf-4a4a-8725-c917921e58c7\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321430 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6wb\" (UniqueName: \"kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb\") pod \"3c744d6e-38ea-451d-abe1-03208a580698\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts\") pod \"079e1231-eb4d-4e9b-b265-f1fd17be981c\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts\") pod \"3c744d6e-38ea-451d-abe1-03208a580698\" (UID: \"3c744d6e-38ea-451d-abe1-03208a580698\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bk6f\" (UniqueName: \"kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f\") pod \"079e1231-eb4d-4e9b-b265-f1fd17be981c\" (UID: \"079e1231-eb4d-4e9b-b265-f1fd17be981c\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82gt2\" (UniqueName: \"kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2\") pod \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfkt\" (UniqueName: \"kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt\") pod \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\" (UID: \"f53cda25-6fa5-4ec3-bf8f-686c8619e97f\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts\") pod \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\" (UID: \"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494\") " Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb744c9f-38bf-4a4a-8725-c917921e58c7" (UID: "fb744c9f-38bf-4a4a-8725-c917921e58c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.321958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f53cda25-6fa5-4ec3-bf8f-686c8619e97f" (UID: "f53cda25-6fa5-4ec3-bf8f-686c8619e97f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.322684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "079e1231-eb4d-4e9b-b265-f1fd17be981c" (UID: "079e1231-eb4d-4e9b-b265-f1fd17be981c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.322761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" (UID: "5ca66ccd-e7ca-4cbc-84f7-5acafa38d494"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.323114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c744d6e-38ea-451d-abe1-03208a580698" (UID: "3c744d6e-38ea-451d-abe1-03208a580698"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.323142 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.323163 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb744c9f-38bf-4a4a-8725-c917921e58c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.323175 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/079e1231-eb4d-4e9b-b265-f1fd17be981c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.323187 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.325858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq" (OuterVolumeSpecName: "kube-api-access-kplzq") pod "fb744c9f-38bf-4a4a-8725-c917921e58c7" (UID: "fb744c9f-38bf-4a4a-8725-c917921e58c7"). InnerVolumeSpecName "kube-api-access-kplzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.325894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f" (OuterVolumeSpecName: "kube-api-access-4bk6f") pod "079e1231-eb4d-4e9b-b265-f1fd17be981c" (UID: "079e1231-eb4d-4e9b-b265-f1fd17be981c"). InnerVolumeSpecName "kube-api-access-4bk6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.326514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt" (OuterVolumeSpecName: "kube-api-access-ggfkt") pod "f53cda25-6fa5-4ec3-bf8f-686c8619e97f" (UID: "f53cda25-6fa5-4ec3-bf8f-686c8619e97f"). InnerVolumeSpecName "kube-api-access-ggfkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.326983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2" (OuterVolumeSpecName: "kube-api-access-82gt2") pod "5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" (UID: "5ca66ccd-e7ca-4cbc-84f7-5acafa38d494"). InnerVolumeSpecName "kube-api-access-82gt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.328765 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb" (OuterVolumeSpecName: "kube-api-access-sf6wb") pod "3c744d6e-38ea-451d-abe1-03208a580698" (UID: "3c744d6e-38ea-451d-abe1-03208a580698"). InnerVolumeSpecName "kube-api-access-sf6wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.425890 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6wb\" (UniqueName: \"kubernetes.io/projected/3c744d6e-38ea-451d-abe1-03208a580698-kube-api-access-sf6wb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.426607 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c744d6e-38ea-451d-abe1-03208a580698-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.426726 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bk6f\" (UniqueName: \"kubernetes.io/projected/079e1231-eb4d-4e9b-b265-f1fd17be981c-kube-api-access-4bk6f\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.426833 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82gt2\" (UniqueName: \"kubernetes.io/projected/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494-kube-api-access-82gt2\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.427028 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggfkt\" (UniqueName: \"kubernetes.io/projected/f53cda25-6fa5-4ec3-bf8f-686c8619e97f-kube-api-access-ggfkt\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.427214 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kplzq\" (UniqueName: \"kubernetes.io/projected/fb744c9f-38bf-4a4a-8725-c917921e58c7-kube-api-access-kplzq\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.520400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7ea4-account-create-update-x94pt" event={"ID":"5ca66ccd-e7ca-4cbc-84f7-5acafa38d494","Type":"ContainerDied","Data":"a733604766ecf181bd8cba3b13e3e77e8daa6bb85866f0200f558061da6bd758"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.520449 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a733604766ecf181bd8cba3b13e3e77e8daa6bb85866f0200f558061da6bd758" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.520763 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7ea4-account-create-update-x94pt" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.521936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8kwj6" event={"ID":"f53cda25-6fa5-4ec3-bf8f-686c8619e97f","Type":"ContainerDied","Data":"00a0bb5d99332571da508dcfbd6866055dadf71f26c16d0a356ed281c53e68dc"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.521969 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00a0bb5d99332571da508dcfbd6866055dadf71f26c16d0a356ed281c53e68dc" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.522042 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8kwj6" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.524031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-74cnm" event={"ID":"3c744d6e-38ea-451d-abe1-03208a580698","Type":"ContainerDied","Data":"d2b42527bf226455a28bd63ef4e269164ea0e0485b81632ccdf0abd0198713be"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.524074 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b42527bf226455a28bd63ef4e269164ea0e0485b81632ccdf0abd0198713be" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.524142 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-74cnm" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.534745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" event={"ID":"2b8e5b07-b7b6-4f46-81bf-5229500201ca","Type":"ContainerStarted","Data":"194cc581f99d9f64096ae76ff8e5e139c8f7f96baf89a98b338fec5972a28c60"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.535008 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.538165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5123-account-create-update-2hbcf" event={"ID":"ab60b96f-60f6-436a-be55-f1d0edc65b01","Type":"ContainerDied","Data":"3a37fcfe30a4441cdb25c6bcc97522e803741390dc8ac08f13da7e7193767ae3"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.538287 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a37fcfe30a4441cdb25c6bcc97522e803741390dc8ac08f13da7e7193767ae3" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.538768 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5123-account-create-update-2hbcf" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.543149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jq4xn" event={"ID":"fb744c9f-38bf-4a4a-8725-c917921e58c7","Type":"ContainerDied","Data":"5f17c14b36fe93f49969d2ef6c54ad65e1419bbdf618e54e648afcfc209d89fb"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.543357 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f17c14b36fe93f49969d2ef6c54ad65e1419bbdf618e54e648afcfc209d89fb" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.543543 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jq4xn" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.548963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e578-account-create-update-jwhdh" event={"ID":"079e1231-eb4d-4e9b-b265-f1fd17be981c","Type":"ContainerDied","Data":"5dddec1fa0a21948ab559f65b7ab2531a581df654613241991f7c7de4916f9ff"} Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.549005 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dddec1fa0a21948ab559f65b7ab2531a581df654613241991f7c7de4916f9ff" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.549073 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e578-account-create-update-jwhdh" Nov 29 01:32:31 crc kubenswrapper[4749]: I1129 01:32:31.557696 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" podStartSLOduration=3.557677935 podStartE2EDuration="3.557677935s" podCreationTimestamp="2025-11-29 01:32:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:31.55705821 +0000 UTC m=+1294.729208077" watchObservedRunningTime="2025-11-29 01:32:31.557677935 +0000 UTC m=+1294.729827792" Nov 29 01:32:35 crc kubenswrapper[4749]: I1129 01:32:35.598458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vzbzm" event={"ID":"9bff88ad-b577-46b9-8bf4-4328b3684b6d","Type":"ContainerStarted","Data":"0f0213b5ccee6839849d82e20a5d7ee9aa22319de6297d00f42c6ec878d4a759"} Nov 29 01:32:38 crc kubenswrapper[4749]: I1129 01:32:38.626751 4749 generic.go:334] "Generic (PLEG): container finished" podID="9bff88ad-b577-46b9-8bf4-4328b3684b6d" containerID="0f0213b5ccee6839849d82e20a5d7ee9aa22319de6297d00f42c6ec878d4a759" exitCode=0 Nov 29 01:32:38 crc kubenswrapper[4749]: I1129 01:32:38.626820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vzbzm" event={"ID":"9bff88ad-b577-46b9-8bf4-4328b3684b6d","Type":"ContainerDied","Data":"0f0213b5ccee6839849d82e20a5d7ee9aa22319de6297d00f42c6ec878d4a759"} Nov 29 01:32:39 crc kubenswrapper[4749]: I1129 01:32:39.253512 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:32:39 crc kubenswrapper[4749]: I1129 01:32:39.371409 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"] Nov 29 01:32:39 crc kubenswrapper[4749]: I1129 01:32:39.371641 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-5c7gw" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="dnsmasq-dns" containerID="cri-o://d387c2987e2dc3792bc71d9ecfbf2e6a56b4c44b79c8cc37de7229402aa58215" gracePeriod=10 Nov 29 01:32:39 crc kubenswrapper[4749]: I1129 01:32:39.640359 4749 generic.go:334] "Generic (PLEG): container finished" podID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerID="d387c2987e2dc3792bc71d9ecfbf2e6a56b4c44b79c8cc37de7229402aa58215" exitCode=0 Nov 29 01:32:39 crc kubenswrapper[4749]: I1129 01:32:39.640428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5c7gw" event={"ID":"cf32a260-17ee-431a-ab31-9b2215b6823f","Type":"ContainerDied","Data":"d387c2987e2dc3792bc71d9ecfbf2e6a56b4c44b79c8cc37de7229402aa58215"} Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.048532 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.114593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g9pw\" (UniqueName: \"kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw\") pod \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.114734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle\") pod \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.114785 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data\") pod \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\" (UID: \"9bff88ad-b577-46b9-8bf4-4328b3684b6d\") " Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.121373 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw" (OuterVolumeSpecName: "kube-api-access-6g9pw") pod "9bff88ad-b577-46b9-8bf4-4328b3684b6d" (UID: "9bff88ad-b577-46b9-8bf4-4328b3684b6d"). InnerVolumeSpecName "kube-api-access-6g9pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.146368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bff88ad-b577-46b9-8bf4-4328b3684b6d" (UID: "9bff88ad-b577-46b9-8bf4-4328b3684b6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.168959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data" (OuterVolumeSpecName: "config-data") pod "9bff88ad-b577-46b9-8bf4-4328b3684b6d" (UID: "9bff88ad-b577-46b9-8bf4-4328b3684b6d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.217508 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g9pw\" (UniqueName: \"kubernetes.io/projected/9bff88ad-b577-46b9-8bf4-4328b3684b6d-kube-api-access-6g9pw\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.217620 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.217649 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bff88ad-b577-46b9-8bf4-4328b3684b6d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.651626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vzbzm" event={"ID":"9bff88ad-b577-46b9-8bf4-4328b3684b6d","Type":"ContainerDied","Data":"7dc790c9b29b34961491aa340de426c4bc6b3b112a9f3e005d64911bb9f9a51c"} Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.651669 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc790c9b29b34961491aa340de426c4bc6b3b112a9f3e005d64911bb9f9a51c" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.651732 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vzbzm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852081 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852598 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53cda25-6fa5-4ec3-bf8f-686c8619e97f" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53cda25-6fa5-4ec3-bf8f-686c8619e97f" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852644 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852653 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852673 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab60b96f-60f6-436a-be55-f1d0edc65b01" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852681 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab60b96f-60f6-436a-be55-f1d0edc65b01" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852692 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bff88ad-b577-46b9-8bf4-4328b3684b6d" containerName="keystone-db-sync" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852699 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bff88ad-b577-46b9-8bf4-4328b3684b6d" containerName="keystone-db-sync" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fb744c9f-38bf-4a4a-8725-c917921e58c7" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852721 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb744c9f-38bf-4a4a-8725-c917921e58c7" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c744d6e-38ea-451d-abe1-03208a580698" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852747 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c744d6e-38ea-451d-abe1-03208a580698" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852758 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" containerName="init" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852766 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" containerName="init" Nov 29 01:32:40 crc kubenswrapper[4749]: E1129 01:32:40.852784 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079e1231-eb4d-4e9b-b265-f1fd17be981c" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852792 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="079e1231-eb4d-4e9b-b265-f1fd17be981c" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.852981 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c744d6e-38ea-451d-abe1-03208a580698" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853002 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b5efa2-32c6-4726-b09c-e84f58fd4a70" containerName="init" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb744c9f-38bf-4a4a-8725-c917921e58c7" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853027 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab60b96f-60f6-436a-be55-f1d0edc65b01" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853041 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="079e1231-eb4d-4e9b-b265-f1fd17be981c" containerName="mariadb-account-create-update" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853059 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53cda25-6fa5-4ec3-bf8f-686c8619e97f" containerName="mariadb-database-create" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.853068 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bff88ad-b577-46b9-8bf4-4328b3684b6d" containerName="keystone-db-sync" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.854087 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.874434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.909454 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nfmvr"] Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.910582 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.913609 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.913794 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.913914 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5mw78" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.914017 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.914155 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928359 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsbbn\" (UniqueName: \"kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.928561 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:40 crc kubenswrapper[4749]: I1129 01:32:40.929992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfmvr"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.029795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.029843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.029893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.029951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.029979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhjv\" (UniqueName: \"kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: 
\"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsbbn\" (UniqueName: \"kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.030190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.031360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.031382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.032146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.034227 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.034733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 
01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.071299 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsbbn\" (UniqueName: \"kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn\") pod \"dnsmasq-dns-847c4cc679-lq7sm\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.102261 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qtf89"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.103228 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.106096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qtf89"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.106408 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6hwt" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.106643 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.107784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhjv\" (UniqueName: \"kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.134741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data\") pod \"keystone-bootstrap-nfmvr\" 
(UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.146686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.147374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.159666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.160183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.160859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.183753 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.184138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhjv\" (UniqueName: \"kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv\") pod \"keystone-bootstrap-nfmvr\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.195233 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rj68d"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.196531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.218667 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.221029 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.221075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bkkgq" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.243630 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.244909 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-294kk\" (UniqueName: \"kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.244983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.245044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.261913 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rj68d"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.273733 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.290220 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-s97ck"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.291630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.296689 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g2f4f" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.296828 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.296870 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.316452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s97ck"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.337660 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x6f5q"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.338721 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.346740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.347015 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59j9g" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.355719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.355788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwjv\" (UniqueName: \"kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.355868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.355898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.356062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.356124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.356148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.356213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-294kk\" (UniqueName: \"kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.356258 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.371111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.375679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.388432 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.390652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-294kk\" (UniqueName: \"kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk\") pod \"neutron-db-sync-qtf89\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.394088 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.405670 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x6f5q"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.438565 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qtf89" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8nr2\" (UniqueName: \"kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwjv\" (UniqueName: \"kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srt9w\" (UniqueName: \"kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " 
pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.458775 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.459583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.459820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.459883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.459973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.460024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.460339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.464391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.465582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.477657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 
01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.478043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.480890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqwjv\" (UniqueName: \"kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv\") pod \"cinder-db-sync-rj68d\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.487315 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.489650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.492250 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.493920 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.505251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn654\" (UniqueName: \"kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 
01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8nr2\" (UniqueName: \"kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srt9w\" (UniqueName: \"kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562897 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t4h4\" (UniqueName: \"kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.562999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.563026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.563065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.563112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.565675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.566727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.573272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle\") pod \"placement-db-sync-s97ck\" 
(UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.574083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.576752 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rj68d" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.582896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8nr2\" (UniqueName: \"kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.585209 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srt9w\" (UniqueName: \"kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w\") pod \"barbican-db-sync-x6f5q\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.586669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.587286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data\") pod \"placement-db-sync-s97ck\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.664536 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-s97ck" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.664992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn654\" (UniqueName: \"kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665405 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.665534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t4h4\" (UniqueName: \"kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.666699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.667805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.668583 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.668978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.669167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.669660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.669720 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.672566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-5c7gw" event={"ID":"cf32a260-17ee-431a-ab31-9b2215b6823f","Type":"ContainerDied","Data":"d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a"} Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.675271 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d52dd243062c67132014b6929531d8ddad7560fa9432236524994f7ebb91ff6a" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.682819 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.686838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.693038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t4h4\" (UniqueName: \"kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.698287 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.699955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn654\" (UniqueName: \"kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654\") pod \"dnsmasq-dns-785d8bcb8c-pfg8z\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.700547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.706631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts\") pod \"ceilometer-0\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.731400 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.758130 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.843814 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.964266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.975227 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb\") pod \"cf32a260-17ee-431a-ab31-9b2215b6823f\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.975284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnphs\" (UniqueName: \"kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs\") pod \"cf32a260-17ee-431a-ab31-9b2215b6823f\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.975312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc\") pod \"cf32a260-17ee-431a-ab31-9b2215b6823f\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.975437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config\") pod \"cf32a260-17ee-431a-ab31-9b2215b6823f\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " Nov 29 01:32:41 crc kubenswrapper[4749]: I1129 01:32:41.975494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb\") pod \"cf32a260-17ee-431a-ab31-9b2215b6823f\" (UID: \"cf32a260-17ee-431a-ab31-9b2215b6823f\") " Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.008022 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs" (OuterVolumeSpecName: "kube-api-access-cnphs") pod "cf32a260-17ee-431a-ab31-9b2215b6823f" (UID: "cf32a260-17ee-431a-ab31-9b2215b6823f"). InnerVolumeSpecName "kube-api-access-cnphs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:42 crc kubenswrapper[4749]: W1129 01:32:42.012601 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f01aa57_9fab_45c8_b4a0_2b951813e4df.slice/crio-b02ce9dff8f4bc5c68690fecede828feed4f36d84891ce73d0f6f1a71ea42a56 WatchSource:0}: Error finding container b02ce9dff8f4bc5c68690fecede828feed4f36d84891ce73d0f6f1a71ea42a56: Status 404 returned error can't find the container with id b02ce9dff8f4bc5c68690fecede828feed4f36d84891ce73d0f6f1a71ea42a56 Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.025966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:32:42 crc kubenswrapper[4749]: E1129 01:32:42.026332 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="init" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.026350 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="init" Nov 29 01:32:42 crc kubenswrapper[4749]: E1129 01:32:42.026374 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="dnsmasq-dns" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.026380 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="dnsmasq-dns" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.026533 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" containerName="dnsmasq-dns" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.027364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.031621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.031874 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.032458 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-brkw4" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.032563 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.065073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.071911 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfmvr"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.076965 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnphs\" (UniqueName: \"kubernetes.io/projected/cf32a260-17ee-431a-ab31-9b2215b6823f-kube-api-access-cnphs\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.100752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf32a260-17ee-431a-ab31-9b2215b6823f" (UID: "cf32a260-17ee-431a-ab31-9b2215b6823f"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.108380 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config" (OuterVolumeSpecName: "config") pod "cf32a260-17ee-431a-ab31-9b2215b6823f" (UID: "cf32a260-17ee-431a-ab31-9b2215b6823f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.115310 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf32a260-17ee-431a-ab31-9b2215b6823f" (UID: "cf32a260-17ee-431a-ab31-9b2215b6823f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.116391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf32a260-17ee-431a-ab31-9b2215b6823f" (UID: "cf32a260-17ee-431a-ab31-9b2215b6823f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.157782 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.159219 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.164514 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.164844 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.178443 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179844 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.179923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.180021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qs2l\" (UniqueName: \"kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.180111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.185282 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.186548 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.186565 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.186575 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf32a260-17ee-431a-ab31-9b2215b6823f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qs2l\" (UniqueName: \"kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v69df\" (UniqueName: \"kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288913 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.288989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.289010 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.289029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.289044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.289669 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.290354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.290536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.297378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.298904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.301267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.302485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.302609 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qtf89"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.316863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qs2l\" (UniqueName: \"kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.323465 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rj68d"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.339105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.380314 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.391080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.391778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v69df\" (UniqueName: \"kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.392568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc 
kubenswrapper[4749]: I1129 01:32:42.393920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.394066 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.404318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.404845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.407771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.430587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v69df\" (UniqueName: \"kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.438585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.455681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.465144 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-s97ck"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.481106 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.488291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x6f5q"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.654655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.668678 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.721974 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rj68d" event={"ID":"2ab7557b-b040-450a-b0ad-437720fab3a2","Type":"ContainerStarted","Data":"4af4b725638963a5e22f9788f575f26d953d1ee9b6b5e019c867addaa01bef16"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.730691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s97ck" event={"ID":"66c154a8-70f6-41c6-9040-bfaea3b6caf1","Type":"ContainerStarted","Data":"37589ae04a85bbe82354a922b8ea9f9eddb973efc5584959548d402d8712e29e"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.741738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfmvr" event={"ID":"a78f6390-356b-4502-b065-ce16c7b2cfb3","Type":"ContainerStarted","Data":"e5f55d830c3b9beb232739425c6d8c977604c4ed1e22fec74f43465868d1c470"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.741823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfmvr" event={"ID":"a78f6390-356b-4502-b065-ce16c7b2cfb3","Type":"ContainerStarted","Data":"782035dc25b064616f1e371d6a129a3bb42414d0a195ba3a94c1673d62fdaff8"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.754076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qtf89" event={"ID":"655fb5be-d1d0-4e82-bc40-f76dd4ddb133","Type":"ContainerStarted","Data":"79eea270e39e7f4cc2a391113d6a21f6ec7ca132b30b00de81c6ba37e4fee5f5"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.754113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qtf89" event={"ID":"655fb5be-d1d0-4e82-bc40-f76dd4ddb133","Type":"ContainerStarted","Data":"53ae2528f58cc816c3291c842179359f17991c6b4ebda0df6f212fb86d92db39"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.763787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6f5q" event={"ID":"f9b28632-689c-484b-91b4-c57e9d67a6cf","Type":"ContainerStarted","Data":"0ce001e92b708434d80728c02b725f83c983d260bf5619ca96dd8e7cf98ad28a"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.768979 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f01aa57-9fab-45c8-b4a0-2b951813e4df" containerID="099084021ef50798cce53861609f6928b0d659456d2c484fac09fe75e1c74207" exitCode=0 Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.769077 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-5c7gw" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.776738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" event={"ID":"9f01aa57-9fab-45c8-b4a0-2b951813e4df","Type":"ContainerDied","Data":"099084021ef50798cce53861609f6928b0d659456d2c484fac09fe75e1c74207"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.776784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" event={"ID":"9f01aa57-9fab-45c8-b4a0-2b951813e4df","Type":"ContainerStarted","Data":"b02ce9dff8f4bc5c68690fecede828feed4f36d84891ce73d0f6f1a71ea42a56"} Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.785652 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nfmvr" podStartSLOduration=2.785633487 podStartE2EDuration="2.785633487s" podCreationTimestamp="2025-11-29 01:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:42.764066819 +0000 UTC m=+1305.936216676" watchObservedRunningTime="2025-11-29 01:32:42.785633487 +0000 UTC m=+1305.957783344" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.790676 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qtf89" podStartSLOduration=1.790666523 podStartE2EDuration="1.790666523s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:42.78294307 +0000 UTC m=+1305.955092947" watchObservedRunningTime="2025-11-29 01:32:42.790666523 +0000 UTC m=+1305.962816380" Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.826932 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"] Nov 29 01:32:42 crc kubenswrapper[4749]: I1129 01:32:42.830324 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-5c7gw"] Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.098180 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf32a260-17ee-431a-ab31-9b2215b6823f" path="/var/lib/kubelet/pods/cf32a260-17ee-431a-ab31-9b2215b6823f/volumes" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.101013 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:32:43 crc kubenswrapper[4749]: W1129 01:32:43.112231 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cf6fba7_8e2c_4890_bf97_6c1b6b97738c.slice/crio-61eef05999ea5fa3bb39f5abdc499fd10daea12f9f725206517c4f55f5c7c0e5 WatchSource:0}: Error finding container 61eef05999ea5fa3bb39f5abdc499fd10daea12f9f725206517c4f55f5c7c0e5: Status 404 returned error can't find the container with id 61eef05999ea5fa3bb39f5abdc499fd10daea12f9f725206517c4f55f5c7c0e5 Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.132100 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.219861 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232360 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsbbn\" (UniqueName: \"kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.232465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0\") pod \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\" (UID: \"9f01aa57-9fab-45c8-b4a0-2b951813e4df\") " Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.241652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn" (OuterVolumeSpecName: "kube-api-access-vsbbn") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "kube-api-access-vsbbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.265914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config" (OuterVolumeSpecName: "config") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.269772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.274722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.281562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.282691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f01aa57-9fab-45c8-b4a0-2b951813e4df" (UID: "9f01aa57-9fab-45c8-b4a0-2b951813e4df"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334377 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334408 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334417 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334427 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334436 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsbbn\" (UniqueName: \"kubernetes.io/projected/9f01aa57-9fab-45c8-b4a0-2b951813e4df-kube-api-access-vsbbn\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.334445 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f01aa57-9fab-45c8-b4a0-2b951813e4df-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.789176 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" event={"ID":"9f01aa57-9fab-45c8-b4a0-2b951813e4df","Type":"ContainerDied","Data":"b02ce9dff8f4bc5c68690fecede828feed4f36d84891ce73d0f6f1a71ea42a56"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.789214 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-lq7sm" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.789513 4749 scope.go:117] "RemoveContainer" containerID="099084021ef50798cce53861609f6928b0d659456d2c484fac09fe75e1c74207" Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.791857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerStarted","Data":"e4e07575f031401e6ef44dda63a148367d05c06bc1edad2f9b8db12771f61b40"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.798175 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerStarted","Data":"cbfaf246d8cd5185b7bf3d3b10a30e45b3dc359f4790a8f70093a661f5690fc9"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.800139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerStarted","Data":"61eef05999ea5fa3bb39f5abdc499fd10daea12f9f725206517c4f55f5c7c0e5"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.804398 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerID="1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75" exitCode=0 Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.804447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" event={"ID":"9c4a57d2-c6ae-4669-9d0e-0283eebe5923","Type":"ContainerDied","Data":"1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.804485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" event={"ID":"9c4a57d2-c6ae-4669-9d0e-0283eebe5923","Type":"ContainerStarted","Data":"ad070fe038a78bdaca233a47eb7e02361ca2c2f9c7a0df9f255d712d87bf4c53"} Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.917624 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:43 crc kubenswrapper[4749]: I1129 01:32:43.962339 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-lq7sm"] Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.641801 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.693875 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.825671 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.869501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerStarted","Data":"24627ecff2d38ac3dcdf5b2a08c23dd79a5a3c3ff1e4f822e89280caa68ecf50"} Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 
01:32:44.873546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerStarted","Data":"ec5f308e74fd3cfe3113d0277eb7015905a7e6b49ba0e269551de0400baabe22"} Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.876598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" event={"ID":"9c4a57d2-c6ae-4669-9d0e-0283eebe5923","Type":"ContainerStarted","Data":"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7"} Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.877533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:44 crc kubenswrapper[4749]: I1129 01:32:44.913492 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" podStartSLOduration=3.913472017 podStartE2EDuration="3.913472017s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:44.907026666 +0000 UTC m=+1308.079176533" watchObservedRunningTime="2025-11-29 01:32:44.913472017 +0000 UTC m=+1308.085621874" Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.101721 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f01aa57-9fab-45c8-b4a0-2b951813e4df" path="/var/lib/kubelet/pods/9f01aa57-9fab-45c8-b4a0-2b951813e4df/volumes" Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.911011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerStarted","Data":"efb75c91a805af2554b96d7665ef60b66f6f9716803bc5115b7f724c2c7d930c"} Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.911222 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-log" containerID="cri-o://ec5f308e74fd3cfe3113d0277eb7015905a7e6b49ba0e269551de0400baabe22" gracePeriod=30 Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.911294 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-httpd" containerID="cri-o://efb75c91a805af2554b96d7665ef60b66f6f9716803bc5115b7f724c2c7d930c" gracePeriod=30 Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.916291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerStarted","Data":"60c518371eec99e11dbf56d97bbeedd0be5855f73a83628f5705eb4b245ae62e"} Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.916388 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-log" containerID="cri-o://24627ecff2d38ac3dcdf5b2a08c23dd79a5a3c3ff1e4f822e89280caa68ecf50" gracePeriod=30 Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.916431 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-httpd" 
containerID="cri-o://60c518371eec99e11dbf56d97bbeedd0be5855f73a83628f5705eb4b245ae62e" gracePeriod=30 Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.932838 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.932818194 podStartE2EDuration="5.932818194s" podCreationTimestamp="2025-11-29 01:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:45.930608459 +0000 UTC m=+1309.102758316" watchObservedRunningTime="2025-11-29 01:32:45.932818194 +0000 UTC m=+1309.104968051" Nov 29 01:32:45 crc kubenswrapper[4749]: I1129 01:32:45.956581 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.956560117 podStartE2EDuration="4.956560117s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:32:45.954802623 +0000 UTC m=+1309.126952480" watchObservedRunningTime="2025-11-29 01:32:45.956560117 +0000 UTC m=+1309.128709984" Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.928247 4749 generic.go:334] "Generic (PLEG): container finished" podID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerID="60c518371eec99e11dbf56d97bbeedd0be5855f73a83628f5705eb4b245ae62e" exitCode=0 Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.928558 4749 generic.go:334] "Generic (PLEG): container finished" podID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerID="24627ecff2d38ac3dcdf5b2a08c23dd79a5a3c3ff1e4f822e89280caa68ecf50" exitCode=143 Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.928411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerDied","Data":"60c518371eec99e11dbf56d97bbeedd0be5855f73a83628f5705eb4b245ae62e"} Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.928642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerDied","Data":"24627ecff2d38ac3dcdf5b2a08c23dd79a5a3c3ff1e4f822e89280caa68ecf50"} Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.930886 4749 generic.go:334] "Generic (PLEG): container finished" podID="a78f6390-356b-4502-b065-ce16c7b2cfb3" containerID="e5f55d830c3b9beb232739425c6d8c977604c4ed1e22fec74f43465868d1c470" exitCode=0 Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.930962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfmvr" event={"ID":"a78f6390-356b-4502-b065-ce16c7b2cfb3","Type":"ContainerDied","Data":"e5f55d830c3b9beb232739425c6d8c977604c4ed1e22fec74f43465868d1c470"} Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.934166 4749 generic.go:334] "Generic (PLEG): container finished" podID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerID="efb75c91a805af2554b96d7665ef60b66f6f9716803bc5115b7f724c2c7d930c" exitCode=0 Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.934200 4749 generic.go:334] "Generic (PLEG): container finished" podID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerID="ec5f308e74fd3cfe3113d0277eb7015905a7e6b49ba0e269551de0400baabe22" exitCode=143 Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.934237 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerDied","Data":"efb75c91a805af2554b96d7665ef60b66f6f9716803bc5115b7f724c2c7d930c"} Nov 29 01:32:46 crc kubenswrapper[4749]: I1129 01:32:46.934259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerDied","Data":"ec5f308e74fd3cfe3113d0277eb7015905a7e6b49ba0e269551de0400baabe22"} Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.801741 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.968789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfmvr" event={"ID":"a78f6390-356b-4502-b065-ce16c7b2cfb3","Type":"ContainerDied","Data":"782035dc25b064616f1e371d6a129a3bb42414d0a195ba3a94c1673d62fdaff8"} Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.968834 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="782035dc25b064616f1e371d6a129a3bb42414d0a195ba3a94c1673d62fdaff8" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.968909 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfmvr" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.977907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.978084 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.978133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.978238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhjv\" (UniqueName: \"kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.978304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.978409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle\") pod \"a78f6390-356b-4502-b065-ce16c7b2cfb3\" (UID: \"a78f6390-356b-4502-b065-ce16c7b2cfb3\") " Nov 29 
01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.985691 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.989404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv" (OuterVolumeSpecName: "kube-api-access-kvhjv") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "kube-api-access-kvhjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.996382 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts" (OuterVolumeSpecName: "scripts") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:48 crc kubenswrapper[4749]: I1129 01:32:48.996420 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.016022 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.016454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data" (OuterVolumeSpecName: "config-data") pod "a78f6390-356b-4502-b065-ce16c7b2cfb3" (UID: "a78f6390-356b-4502-b065-ce16c7b2cfb3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080249 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080490 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080500 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhjv\" (UniqueName: \"kubernetes.io/projected/a78f6390-356b-4502-b065-ce16c7b2cfb3-kube-api-access-kvhjv\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080510 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080518 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.080526 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a78f6390-356b-4502-b065-ce16c7b2cfb3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.140148 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nfmvr"] Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.148584 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nfmvr"] Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.217459 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bvc2f"] Nov 29 01:32:49 crc kubenswrapper[4749]: E1129 01:32:49.217852 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78f6390-356b-4502-b065-ce16c7b2cfb3" containerName="keystone-bootstrap" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.217868 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78f6390-356b-4502-b065-ce16c7b2cfb3" containerName="keystone-bootstrap" Nov 29 01:32:49 crc kubenswrapper[4749]: E1129 01:32:49.217881 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f01aa57-9fab-45c8-b4a0-2b951813e4df" containerName="init" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.217889 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f01aa57-9fab-45c8-b4a0-2b951813e4df" containerName="init" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.218132 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78f6390-356b-4502-b065-ce16c7b2cfb3" containerName="keystone-bootstrap" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.218147 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f01aa57-9fab-45c8-b4a0-2b951813e4df" containerName="init" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.218823 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.227165 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bvc2f"] Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385340 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd5v8\" (UniqueName: \"kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.385385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.487718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd5v8\" (UniqueName: \"kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.488101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.490036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.490076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.490114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.490229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.494766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.494913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.495152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.496170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.503079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.508667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd5v8\" (UniqueName: \"kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8\") pod \"keystone-bootstrap-bvc2f\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " pod="openstack/keystone-bootstrap-bvc2f" 
Nov 29 01:32:49 crc kubenswrapper[4749]: I1129 01:32:49.545294 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:32:51 crc kubenswrapper[4749]: I1129 01:32:51.091504 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78f6390-356b-4502-b065-ce16c7b2cfb3" path="/var/lib/kubelet/pods/a78f6390-356b-4502-b065-ce16c7b2cfb3/volumes" Nov 29 01:32:51 crc kubenswrapper[4749]: I1129 01:32:51.733456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:32:51 crc kubenswrapper[4749]: I1129 01:32:51.793261 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:32:51 crc kubenswrapper[4749]: I1129 01:32:51.793490 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" containerID="cri-o://194cc581f99d9f64096ae76ff8e5e139c8f7f96baf89a98b338fec5972a28c60" gracePeriod=10 Nov 29 01:32:53 crc kubenswrapper[4749]: I1129 01:32:53.014285 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerID="194cc581f99d9f64096ae76ff8e5e139c8f7f96baf89a98b338fec5972a28c60" exitCode=0 Nov 29 01:32:53 crc kubenswrapper[4749]: I1129 01:32:53.014400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" event={"ID":"2b8e5b07-b7b6-4f46-81bf-5229500201ca","Type":"ContainerDied","Data":"194cc581f99d9f64096ae76ff8e5e139c8f7f96baf89a98b338fec5972a28c60"} Nov 29 01:32:54 crc kubenswrapper[4749]: I1129 01:32:54.252843 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Nov 29 01:32:55 crc kubenswrapper[4749]: I1129 01:32:55.374653 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:32:55 crc kubenswrapper[4749]: I1129 01:32:55.374751 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:32:56 crc kubenswrapper[4749]: E1129 01:32:56.191162 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 29 01:32:56 crc kubenswrapper[4749]: E1129 01:32:56.191555 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-srt9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-x6f5q_openstack(f9b28632-689c-484b-91b4-c57e9d67a6cf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:32:56 crc kubenswrapper[4749]: E1129 01:32:56.192744 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-x6f5q" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" Nov 29 01:32:57 crc kubenswrapper[4749]: E1129 01:32:57.057947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-x6f5q" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" Nov 29 01:32:59 crc kubenswrapper[4749]: I1129 01:32:59.252748 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.118440 4749 generic.go:334] "Generic (PLEG): container finished" podID="655fb5be-d1d0-4e82-bc40-f76dd4ddb133" containerID="79eea270e39e7f4cc2a391113d6a21f6ec7ca132b30b00de81c6ba37e4fee5f5" exitCode=0 Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.118541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qtf89" event={"ID":"655fb5be-d1d0-4e82-bc40-f76dd4ddb133","Type":"ContainerDied","Data":"79eea270e39e7f4cc2a391113d6a21f6ec7ca132b30b00de81c6ba37e4fee5f5"} Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.427717 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.443868 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548115 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548739 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v69df\" (UniqueName: \"kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548811 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.548873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data\") pod \"58b2b2df-fe58-4bc7-893f-4808b1822032\" (UID: \"58b2b2df-fe58-4bc7-893f-4808b1822032\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qs2l\" (UniqueName: \"kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549278 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\" (UID: \"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.549990 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.550324 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.551132 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs" (OuterVolumeSpecName: "logs") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.555917 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs" (OuterVolumeSpecName: "logs") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.559122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts" (OuterVolumeSpecName: "scripts") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.559528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.560358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df" (OuterVolumeSpecName: "kube-api-access-v69df") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "kube-api-access-v69df". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.562973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l" (OuterVolumeSpecName: "kube-api-access-8qs2l") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "kube-api-access-8qs2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.570601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts" (OuterVolumeSpecName: "scripts") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.588001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.612521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.630671 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.635262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.642786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data" (OuterVolumeSpecName: "config-data") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.647475 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data" (OuterVolumeSpecName: "config-data") pod "58b2b2df-fe58-4bc7-893f-4808b1822032" (UID: "58b2b2df-fe58-4bc7-893f-4808b1822032"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651843 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651874 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v69df\" (UniqueName: \"kubernetes.io/projected/58b2b2df-fe58-4bc7-893f-4808b1822032-kube-api-access-v69df\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651884 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651909 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651920 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651928 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651958 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651970 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qs2l\" (UniqueName: \"kubernetes.io/projected/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-kube-api-access-8qs2l\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.651994 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.652007 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58b2b2df-fe58-4bc7-893f-4808b1822032-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.652016 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.652025 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b2b2df-fe58-4bc7-893f-4808b1822032-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.652036 4749 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.671527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" (UID: "3cf6fba7-8e2c-4890-bf97-6c1b6b97738c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.674948 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.675213 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.753746 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.753778 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.753789 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:03 crc kubenswrapper[4749]: E1129 01:33:03.914437 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 29 01:33:03 crc kubenswrapper[4749]: E1129 01:33:03.914680 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n587h64h66bhf7h57dh5bbhf6h5dh666h658h5ch58bh56dh5d9h5c5h687h67dh69h54ch54fh5bdh648h657h679h676h575hb4h66bh67dh67ch677h97q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t4h4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.923388 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.959867 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.960324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7rks\" (UniqueName: \"kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.960415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.960511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.963407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.963625 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc\") pod \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\" (UID: \"2b8e5b07-b7b6-4f46-81bf-5229500201ca\") " Nov 29 01:33:03 crc kubenswrapper[4749]: I1129 01:33:03.967689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks" (OuterVolumeSpecName: "kube-api-access-b7rks") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "kube-api-access-b7rks". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.012457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config" (OuterVolumeSpecName: "config") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.012649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.017833 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.023145 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.029293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b8e5b07-b7b6-4f46-81bf-5229500201ca" (UID: "2b8e5b07-b7b6-4f46-81bf-5229500201ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066109 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066161 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7rks\" (UniqueName: \"kubernetes.io/projected/2b8e5b07-b7b6-4f46-81bf-5229500201ca-kube-api-access-b7rks\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066178 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066189 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066215 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.066226 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b8e5b07-b7b6-4f46-81bf-5229500201ca-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.132107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58b2b2df-fe58-4bc7-893f-4808b1822032","Type":"ContainerDied","Data":"cbfaf246d8cd5185b7bf3d3b10a30e45b3dc359f4790a8f70093a661f5690fc9"} Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.132164 4749 scope.go:117] "RemoveContainer" containerID="60c518371eec99e11dbf56d97bbeedd0be5855f73a83628f5705eb4b245ae62e" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.132183 4749 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.134952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" event={"ID":"2b8e5b07-b7b6-4f46-81bf-5229500201ca","Type":"ContainerDied","Data":"5e7ce0a4cd1e42ffb00de6514596ceb511d7cb9827f36fa6f47db3ea70e490e1"} Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.135029 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-6npfw" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.151617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3cf6fba7-8e2c-4890-bf97-6c1b6b97738c","Type":"ContainerDied","Data":"61eef05999ea5fa3bb39f5abdc499fd10daea12f9f725206517c4f55f5c7c0e5"} Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.151718 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.201773 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.213721 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.222357 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.230603 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-6npfw"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246261 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246681 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246697 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246704 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="init" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246711 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="init" Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246724 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246731 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246740 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246748 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246767 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246773 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: E1129 01:33:04.246787 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246794 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246948 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246963 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246975 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" containerName="dnsmasq-dns" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246982 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" containerName="glance-log" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.246991 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" containerName="glance-httpd" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.247896 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.252592 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.252647 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-brkw4" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.252970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.253113 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.265854 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.282257 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.294587 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.299546 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.301216 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.303723 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.304003 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.307782 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.375979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8f6\" (UniqueName: \"kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376033 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c28vk\" (UniqueName: \"kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.376722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc 
kubenswrapper[4749]: I1129 01:33:04.499855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.499916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.499965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.499994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c28vk\" (UniqueName: \"kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500050 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500271 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500718 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.501064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.500274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505208 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8f6\" (UniqueName: \"kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.505978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.506176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.509318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.509499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.509873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.510278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.510850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.512077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.515938 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c28vk\" (UniqueName: \"kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.523478 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.531711 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8f6\" (UniqueName: \"kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.533654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.534853 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " pod="openstack/glance-default-external-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.577012 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:04 crc kubenswrapper[4749]: I1129 01:33:04.672115 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.085885 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8e5b07-b7b6-4f46-81bf-5229500201ca" path="/var/lib/kubelet/pods/2b8e5b07-b7b6-4f46-81bf-5229500201ca/volumes" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.087149 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf6fba7-8e2c-4890-bf97-6c1b6b97738c" path="/var/lib/kubelet/pods/3cf6fba7-8e2c-4890-bf97-6c1b6b97738c/volumes" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.087827 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b2b2df-fe58-4bc7-893f-4808b1822032" path="/var/lib/kubelet/pods/58b2b2df-fe58-4bc7-893f-4808b1822032/volumes" Nov 29 01:33:05 crc kubenswrapper[4749]: E1129 01:33:05.341397 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 29 01:33:05 crc kubenswrapper[4749]: E1129 01:33:05.341577 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqwjv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rj68d_openstack(2ab7557b-b040-450a-b0ad-437720fab3a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 01:33:05 crc kubenswrapper[4749]: E1129 01:33:05.342792 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rj68d" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.360964 4749 scope.go:117] "RemoveContainer" containerID="24627ecff2d38ac3dcdf5b2a08c23dd79a5a3c3ff1e4f822e89280caa68ecf50" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.513842 4749 scope.go:117] "RemoveContainer" containerID="194cc581f99d9f64096ae76ff8e5e139c8f7f96baf89a98b338fec5972a28c60" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.520090 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qtf89" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.577909 4749 scope.go:117] "RemoveContainer" containerID="da2b71583938cd818f6cb4c15d838cd882df14adc0c7852627aff51e6faaa49c" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.632425 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config\") pod \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.632521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-294kk\" (UniqueName: \"kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk\") pod \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.632577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle\") pod \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\" (UID: \"655fb5be-d1d0-4e82-bc40-f76dd4ddb133\") " Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.639156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk" (OuterVolumeSpecName: "kube-api-access-294kk") pod "655fb5be-d1d0-4e82-bc40-f76dd4ddb133" (UID: "655fb5be-d1d0-4e82-bc40-f76dd4ddb133"). InnerVolumeSpecName "kube-api-access-294kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.643865 4749 scope.go:117] "RemoveContainer" containerID="efb75c91a805af2554b96d7665ef60b66f6f9716803bc5115b7f724c2c7d930c" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.662614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config" (OuterVolumeSpecName: "config") pod "655fb5be-d1d0-4e82-bc40-f76dd4ddb133" (UID: "655fb5be-d1d0-4e82-bc40-f76dd4ddb133"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.664253 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "655fb5be-d1d0-4e82-bc40-f76dd4ddb133" (UID: "655fb5be-d1d0-4e82-bc40-f76dd4ddb133"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.674471 4749 scope.go:117] "RemoveContainer" containerID="ec5f308e74fd3cfe3113d0277eb7015905a7e6b49ba0e269551de0400baabe22" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.734454 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.734492 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.734506 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-294kk\" (UniqueName: \"kubernetes.io/projected/655fb5be-d1d0-4e82-bc40-f76dd4ddb133-kube-api-access-294kk\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.818326 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bvc2f"] Nov 29 01:33:05 crc kubenswrapper[4749]: W1129 01:33:05.828976 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a041864_fa44_412d_a3ef_0d1af966cd48.slice/crio-96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f WatchSource:0}: Error finding container 96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f: Status 404 returned error can't find the container with id 96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f Nov 29 01:33:05 crc kubenswrapper[4749]: I1129 01:33:05.980432 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.069396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:33:06 crc kubenswrapper[4749]: W1129 01:33:06.163987 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb390a04f_fb35_4166_af23_0b735e2f5266.slice/crio-547a4afd7f21e5d654d8e0386c8b5c88797377a66e39773b43f77bc6eaa388b3 WatchSource:0}: Error finding container 547a4afd7f21e5d654d8e0386c8b5c88797377a66e39773b43f77bc6eaa388b3: Status 404 returned error can't find the container with id 547a4afd7f21e5d654d8e0386c8b5c88797377a66e39773b43f77bc6eaa388b3 Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.177272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s97ck" event={"ID":"66c154a8-70f6-41c6-9040-bfaea3b6caf1","Type":"ContainerStarted","Data":"fb2620ea85039940ac22ecc63eb62adbadd8cd8f12ac051b31c46aa498066f54"} Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.184722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qtf89" 
event={"ID":"655fb5be-d1d0-4e82-bc40-f76dd4ddb133","Type":"ContainerDied","Data":"53ae2528f58cc816c3291c842179359f17991c6b4ebda0df6f212fb86d92db39"} Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.184765 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53ae2528f58cc816c3291c842179359f17991c6b4ebda0df6f212fb86d92db39" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.184815 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qtf89" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.186363 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bvc2f" event={"ID":"3a041864-fa44-412d-a3ef-0d1af966cd48","Type":"ContainerStarted","Data":"96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f"} Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.188526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerStarted","Data":"074d319b9bc89fa3dc87e67d039854264afb00b251bc2b5d1fbe7402a12bfbb8"} Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.198429 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-s97ck" podStartSLOduration=2.372074717 podStartE2EDuration="25.198377024s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="2025-11-29 01:32:42.517069152 +0000 UTC m=+1305.689219009" lastFinishedPulling="2025-11-29 01:33:05.343371419 +0000 UTC m=+1328.515521316" observedRunningTime="2025-11-29 01:33:06.19141064 +0000 UTC m=+1329.363560497" watchObservedRunningTime="2025-11-29 01:33:06.198377024 +0000 UTC m=+1329.370526881" Nov 29 01:33:06 crc kubenswrapper[4749]: E1129 01:33:06.211133 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-rj68d" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.693888 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:06 crc kubenswrapper[4749]: E1129 01:33:06.694338 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655fb5be-d1d0-4e82-bc40-f76dd4ddb133" containerName="neutron-db-sync" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.694352 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="655fb5be-d1d0-4e82-bc40-f76dd4ddb133" containerName="neutron-db-sync" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.694555 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="655fb5be-d1d0-4e82-bc40-f76dd4ddb133" containerName="neutron-db-sync" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.695452 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.704496 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.774820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.775261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.775293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.775507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.775683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d64dc\" (UniqueName: \"kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.775757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.825846 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.827769 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.831793 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6hwt" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.831938 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.832229 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.836426 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.843185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d64dc\" (UniqueName: \"kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.877895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.878644 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: 
\"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.879153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.879176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.882535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.883096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.908506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d64dc\" (UniqueName: \"kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc\") pod \"dnsmasq-dns-55f844cf75-bzvtk\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.979994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjts\" (UniqueName: \"kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.980052 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.980137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.980164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " 
pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:06 crc kubenswrapper[4749]: I1129 01:33:06.980240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.038005 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.081278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.081409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjts\" (UniqueName: \"kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.081436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.081494 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.081518 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.086864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.087052 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.088984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " 
pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.091164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.103287 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjts\" (UniqueName: \"kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts\") pod \"neutron-79cf68bf7b-r88sb\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.153528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.215787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerStarted","Data":"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2"} Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.215838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerStarted","Data":"547a4afd7f21e5d654d8e0386c8b5c88797377a66e39773b43f77bc6eaa388b3"} Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.226603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerStarted","Data":"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601"} Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.230367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bvc2f" event={"ID":"3a041864-fa44-412d-a3ef-0d1af966cd48","Type":"ContainerStarted","Data":"aafe1964fd0d38ca73b43970e806be9da911509377abf07f7b6364eee4b52252"} Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.236893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerStarted","Data":"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce"} Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.256989 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bvc2f" podStartSLOduration=18.256971381 podStartE2EDuration="18.256971381s" podCreationTimestamp="2025-11-29 01:32:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:07.248573502 +0000 UTC m=+1330.420723379" watchObservedRunningTime="2025-11-29 01:33:07.256971381 +0000 UTC m=+1330.429121238" Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.561321 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:07 crc kubenswrapper[4749]: I1129 01:33:07.799209 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:07 crc kubenswrapper[4749]: W1129 01:33:07.806642 4749 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod436900b7_4647_44d8_8f0f_6e6077747800.slice/crio-5ac1432c9f6f89391100108cd3e553c15c6f329d150aa4b6283f64a7b8917600 WatchSource:0}: Error finding container 5ac1432c9f6f89391100108cd3e553c15c6f329d150aa4b6283f64a7b8917600: Status 404 returned error can't find the container with id 5ac1432c9f6f89391100108cd3e553c15c6f329d150aa4b6283f64a7b8917600 Nov 29 01:33:08 crc kubenswrapper[4749]: I1129 01:33:08.245997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" event={"ID":"7eee8ed5-a3fb-4460-8b03-337b30a8dd61","Type":"ContainerStarted","Data":"6a437d33013501e2accd706f0f94d6d9dd39be2c618d10f039634c4c4f469544"} Nov 29 01:33:08 crc kubenswrapper[4749]: I1129 01:33:08.247210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerStarted","Data":"5ac1432c9f6f89391100108cd3e553c15c6f329d150aa4b6283f64a7b8917600"} Nov 29 01:33:08 crc kubenswrapper[4749]: I1129 01:33:08.249190 4749 generic.go:334] "Generic (PLEG): container finished" podID="66c154a8-70f6-41c6-9040-bfaea3b6caf1" containerID="fb2620ea85039940ac22ecc63eb62adbadd8cd8f12ac051b31c46aa498066f54" exitCode=0 Nov 29 01:33:08 crc kubenswrapper[4749]: I1129 01:33:08.249989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s97ck" event={"ID":"66c154a8-70f6-41c6-9040-bfaea3b6caf1","Type":"ContainerDied","Data":"fb2620ea85039940ac22ecc63eb62adbadd8cd8f12ac051b31c46aa498066f54"} Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.266919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerStarted","Data":"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834"} Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.275528 4749 generic.go:334] "Generic (PLEG): container finished" podID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerID="eca228af450820e842a5822f41fa9aa8f60e1ad79808b3b12e63ff431558e7c6" exitCode=0 Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.275616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" event={"ID":"7eee8ed5-a3fb-4460-8b03-337b30a8dd61","Type":"ContainerDied","Data":"eca228af450820e842a5822f41fa9aa8f60e1ad79808b3b12e63ff431558e7c6"} Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.304325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerStarted","Data":"92274068016fa71ba48799703446fbcf030ea448fcc91ca833ddc199876bd97f"} Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.316697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerStarted","Data":"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4"} Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.330263 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.330236269 podStartE2EDuration="5.330236269s" podCreationTimestamp="2025-11-29 01:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-29 01:33:09.299902842 +0000 UTC m=+1332.472052699" watchObservedRunningTime="2025-11-29 01:33:09.330236269 +0000 UTC m=+1332.502386126" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.373332 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.373310004 podStartE2EDuration="5.373310004s" podCreationTimestamp="2025-11-29 01:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:09.347695275 +0000 UTC m=+1332.519845132" watchObservedRunningTime="2025-11-29 01:33:09.373310004 +0000 UTC m=+1332.545459861" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.706857 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"] Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.709576 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.717608 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.718510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.737560 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s97ck" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.742256 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"] Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle\") pod \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts\") pod \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs\") pod \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data\") pod \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8nr2\" (UniqueName: \"kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2\") pod \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\" (UID: \"66c154a8-70f6-41c6-9040-bfaea3b6caf1\") " Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 
01:33:09.837560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837613 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvv5\" (UniqueName: \"kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.837720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.838620 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs" (OuterVolumeSpecName: "logs") pod "66c154a8-70f6-41c6-9040-bfaea3b6caf1" (UID: "66c154a8-70f6-41c6-9040-bfaea3b6caf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.843637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts" (OuterVolumeSpecName: "scripts") pod "66c154a8-70f6-41c6-9040-bfaea3b6caf1" (UID: "66c154a8-70f6-41c6-9040-bfaea3b6caf1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.847823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2" (OuterVolumeSpecName: "kube-api-access-g8nr2") pod "66c154a8-70f6-41c6-9040-bfaea3b6caf1" (UID: "66c154a8-70f6-41c6-9040-bfaea3b6caf1"). InnerVolumeSpecName "kube-api-access-g8nr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.866617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66c154a8-70f6-41c6-9040-bfaea3b6caf1" (UID: "66c154a8-70f6-41c6-9040-bfaea3b6caf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.877613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data" (OuterVolumeSpecName: "config-data") pod "66c154a8-70f6-41c6-9040-bfaea3b6caf1" (UID: "66c154a8-70f6-41c6-9040-bfaea3b6caf1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.938920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.938958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.938986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvv5\" (UniqueName: \"kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs\") 
pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939189 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66c154a8-70f6-41c6-9040-bfaea3b6caf1-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939213 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939222 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8nr2\" (UniqueName: \"kubernetes.io/projected/66c154a8-70f6-41c6-9040-bfaea3b6caf1-kube-api-access-g8nr2\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.939235 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.940276 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66c154a8-70f6-41c6-9040-bfaea3b6caf1-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.953822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.954877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.955059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.955961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.959071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config\") pod 
\"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.959591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvv5\" (UniqueName: \"kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:09 crc kubenswrapper[4749]: I1129 01:33:09.960410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle\") pod \"neutron-75b4fcb8ff-s4n9j\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.041730 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.329151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-s97ck" event={"ID":"66c154a8-70f6-41c6-9040-bfaea3b6caf1","Type":"ContainerDied","Data":"37589ae04a85bbe82354a922b8ea9f9eddb973efc5584959548d402d8712e29e"} Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.329474 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37589ae04a85bbe82354a922b8ea9f9eddb973efc5584959548d402d8712e29e" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.329240 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-s97ck" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.334681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" event={"ID":"7eee8ed5-a3fb-4460-8b03-337b30a8dd61","Type":"ContainerStarted","Data":"3beff1d10e953e55a3b106b089faaec16567d6ed10bb480994464b9f19994b2d"} Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.334838 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.343607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerStarted","Data":"378ad2fd5a8829a92d3577ec6ec91a92b3e091baf09191f73f6dd468ff370b3c"} Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.343850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.380804 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" podStartSLOduration=4.380784795 podStartE2EDuration="4.380784795s" podCreationTimestamp="2025-11-29 01:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:10.355657068 +0000 UTC m=+1333.527806925" watchObservedRunningTime="2025-11-29 01:33:10.380784795 +0000 UTC m=+1333.552934652" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.397243 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"] Nov 29 01:33:10 crc kubenswrapper[4749]: E1129 01:33:10.398290 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c154a8-70f6-41c6-9040-bfaea3b6caf1" containerName="placement-db-sync" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.398424 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c154a8-70f6-41c6-9040-bfaea3b6caf1" containerName="placement-db-sync" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.398678 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c154a8-70f6-41c6-9040-bfaea3b6caf1" containerName="placement-db-sync" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.399678 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.401688 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79cf68bf7b-r88sb" podStartSLOduration=4.401671197 podStartE2EDuration="4.401671197s" podCreationTimestamp="2025-11-29 01:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:10.388709473 +0000 UTC m=+1333.560859330" watchObservedRunningTime="2025-11-29 01:33:10.401671197 +0000 UTC m=+1333.573821054" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.405705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.405888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.405992 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-g2f4f" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.406130 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.406264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.440258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"] Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5vdc\" (UniqueName: \"kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552836 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.552941 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5vdc\" (UniqueName: \"kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656495 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.656560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.657294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.666718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.667401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.667651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.670012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.673628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.674872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5vdc\" (UniqueName: \"kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc\") pod \"placement-78bc78f9d8-g85sc\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.742729 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:10 crc kubenswrapper[4749]: I1129 01:33:10.785910 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"] Nov 29 01:33:11 crc kubenswrapper[4749]: I1129 01:33:11.358877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6f5q" event={"ID":"f9b28632-689c-484b-91b4-c57e9d67a6cf","Type":"ContainerStarted","Data":"ff6a195164c1a51091d5571bc5e7947391e0cf746aff0edfbb3435b23fa14721"} Nov 29 01:33:11 crc kubenswrapper[4749]: I1129 01:33:11.380469 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x6f5q" podStartSLOduration=3.010526956 podStartE2EDuration="30.380451672s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="2025-11-29 01:32:42.516221061 +0000 UTC m=+1305.688370918" lastFinishedPulling="2025-11-29 01:33:09.886145777 +0000 UTC m=+1333.058295634" observedRunningTime="2025-11-29 01:33:11.374660887 +0000 UTC m=+1334.546810764" watchObservedRunningTime="2025-11-29 01:33:11.380451672 +0000 UTC m=+1334.552601529" Nov 29 01:33:12 crc kubenswrapper[4749]: I1129 01:33:12.366670 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a041864-fa44-412d-a3ef-0d1af966cd48" containerID="aafe1964fd0d38ca73b43970e806be9da911509377abf07f7b6364eee4b52252" exitCode=0 Nov 29 01:33:12 crc kubenswrapper[4749]: I1129 01:33:12.366814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bvc2f" event={"ID":"3a041864-fa44-412d-a3ef-0d1af966cd48","Type":"ContainerDied","Data":"aafe1964fd0d38ca73b43970e806be9da911509377abf07f7b6364eee4b52252"} Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.577357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.577724 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.627767 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.665263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.673054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.673114 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.722833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 01:33:14 crc kubenswrapper[4749]: I1129 01:33:14.733155 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 01:33:15 crc kubenswrapper[4749]: I1129 01:33:15.408682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 01:33:15 crc kubenswrapper[4749]: I1129 01:33:15.408735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 
29 01:33:15 crc kubenswrapper[4749]: I1129 01:33:15.408752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 01:33:15 crc kubenswrapper[4749]: I1129 01:33:15.409075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.040450 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.117330 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.117797 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="dnsmasq-dns" containerID="cri-o://9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7" gracePeriod=10 Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.433197 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9b28632-689c-484b-91b4-c57e9d67a6cf" containerID="ff6a195164c1a51091d5571bc5e7947391e0cf746aff0edfbb3435b23fa14721" exitCode=0 Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.433263 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6f5q" event={"ID":"f9b28632-689c-484b-91b4-c57e9d67a6cf","Type":"ContainerDied","Data":"ff6a195164c1a51091d5571bc5e7947391e0cf746aff0edfbb3435b23fa14721"} Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.628273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.628622 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.643426 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.880766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:17 crc kubenswrapper[4749]: I1129 01:33:17.880887 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.046302 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.103731 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd5v8\" (UniqueName: \"kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138793 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.138858 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle\") pod \"3a041864-fa44-412d-a3ef-0d1af966cd48\" (UID: \"3a041864-fa44-412d-a3ef-0d1af966cd48\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.144853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts" (OuterVolumeSpecName: "scripts") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.152093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8" (OuterVolumeSpecName: "kube-api-access-rd5v8") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "kube-api-access-rd5v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.153105 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.169980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.193325 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data" (OuterVolumeSpecName: "config-data") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.226365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a041864-fa44-412d-a3ef-0d1af966cd48" (UID: "3a041864-fa44-412d-a3ef-0d1af966cd48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277082 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277119 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277131 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd5v8\" (UniqueName: \"kubernetes.io/projected/3a041864-fa44-412d-a3ef-0d1af966cd48-kube-api-access-rd5v8\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277146 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277160 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.277171 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3a041864-fa44-412d-a3ef-0d1af966cd48-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.339433 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.462673 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bvc2f" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.462682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bvc2f" event={"ID":"3a041864-fa44-412d-a3ef-0d1af966cd48","Type":"ContainerDied","Data":"96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f"} Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.462717 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96e447b587a87077f9bea8d543d13754b72681b75b91f62620bd763326bd989f" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.469263 4749 generic.go:334] "Generic (PLEG): container finished" podID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerID="9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7" exitCode=0 Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.469320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" event={"ID":"9c4a57d2-c6ae-4669-9d0e-0283eebe5923","Type":"ContainerDied","Data":"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7"} Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.469348 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" event={"ID":"9c4a57d2-c6ae-4669-9d0e-0283eebe5923","Type":"ContainerDied","Data":"ad070fe038a78bdaca233a47eb7e02361ca2c2f9c7a0df9f255d712d87bf4c53"} Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.469363 4749 scope.go:117] "RemoveContainer" containerID="9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.469476 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pfg8z" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.481853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.481914 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.481960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.482055 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.482132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.482158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zn654\" (UniqueName: \"kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654\") pod \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\" (UID: \"9c4a57d2-c6ae-4669-9d0e-0283eebe5923\") " Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.483977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerStarted","Data":"ab7a0c1ee75630fab0b25790d247b8e662bf1512dec0c4315a9cb9eb4a24637d"} Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.501061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654" (OuterVolumeSpecName: "kube-api-access-zn654") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "kube-api-access-zn654". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.512620 4749 scope.go:117] "RemoveContainer" containerID="1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.547040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.547429 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.549838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.560035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config" (OuterVolumeSpecName: "config") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.564749 4749 scope.go:117] "RemoveContainer" containerID="9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7" Nov 29 01:33:18 crc kubenswrapper[4749]: E1129 01:33:18.565183 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7\": container with ID starting with 9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7 not found: ID does not exist" containerID="9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.565227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c4a57d2-c6ae-4669-9d0e-0283eebe5923" (UID: "9c4a57d2-c6ae-4669-9d0e-0283eebe5923"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.565236 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7"} err="failed to get container status \"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7\": rpc error: code = NotFound desc = could not find container \"9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7\": container with ID starting with 9369a006b521a24506a498b64c23fc5df3bfd7f258abfb7966cbffea520e33b7 not found: ID does not exist" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.565264 4749 scope.go:117] "RemoveContainer" containerID="1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75" Nov 29 01:33:18 crc kubenswrapper[4749]: E1129 01:33:18.565650 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75\": container with ID starting with 1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75 not found: ID does not exist" containerID="1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.565683 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75"} err="failed to get container status \"1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75\": rpc error: code = NotFound desc = could not find container \"1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75\": container with ID starting with 1c4a689285bdc7c425b0b62f6253ab343ae642aa6aa0f37d9b404a594fdc7e75 not found: ID does not exist" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584218 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584256 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584266 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zn654\" (UniqueName: \"kubernetes.io/projected/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-kube-api-access-zn654\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584278 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584715 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.584731 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c4a57d2-c6ae-4669-9d0e-0283eebe5923-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.721396 
4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"] Nov 29 01:33:18 crc kubenswrapper[4749]: W1129 01:33:18.813702 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3ce27f_fb0b_45a9_99e8_e6e98c5a17ee.slice/crio-f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66 WatchSource:0}: Error finding container f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66: Status 404 returned error can't find the container with id f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66 Nov 29 01:33:18 crc kubenswrapper[4749]: I1129 01:33:18.993819 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.055984 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.091615 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pfg8z"] Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.098058 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srt9w\" (UniqueName: \"kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w\") pod \"f9b28632-689c-484b-91b4-c57e9d67a6cf\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.098129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle\") pod \"f9b28632-689c-484b-91b4-c57e9d67a6cf\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.098270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data\") pod \"f9b28632-689c-484b-91b4-c57e9d67a6cf\" (UID: \"f9b28632-689c-484b-91b4-c57e9d67a6cf\") " Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.125733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9b28632-689c-484b-91b4-c57e9d67a6cf" (UID: "f9b28632-689c-484b-91b4-c57e9d67a6cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.126481 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f9b28632-689c-484b-91b4-c57e9d67a6cf" (UID: "f9b28632-689c-484b-91b4-c57e9d67a6cf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.127511 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w" (OuterVolumeSpecName: "kube-api-access-srt9w") pod "f9b28632-689c-484b-91b4-c57e9d67a6cf" (UID: "f9b28632-689c-484b-91b4-c57e9d67a6cf"). InnerVolumeSpecName "kube-api-access-srt9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.193693 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:33:19 crc kubenswrapper[4749]: E1129 01:33:19.194287 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" containerName="barbican-db-sync" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194305 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" containerName="barbican-db-sync" Nov 29 01:33:19 crc kubenswrapper[4749]: E1129 01:33:19.194321 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="dnsmasq-dns" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194336 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="dnsmasq-dns" Nov 29 01:33:19 crc kubenswrapper[4749]: E1129 01:33:19.194355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="init" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194362 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="init" Nov 29 01:33:19 crc kubenswrapper[4749]: E1129 01:33:19.194375 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a041864-fa44-412d-a3ef-0d1af966cd48" containerName="keystone-bootstrap" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194382 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a041864-fa44-412d-a3ef-0d1af966cd48" containerName="keystone-bootstrap" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194541 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" containerName="barbican-db-sync" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194557 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" containerName="dnsmasq-dns" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.194564 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a041864-fa44-412d-a3ef-0d1af966cd48" containerName="keystone-bootstrap" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.196168 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204220 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204251 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204348 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204439 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.204468 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5mw78" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.205546 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srt9w\" (UniqueName: \"kubernetes.io/projected/f9b28632-689c-484b-91b4-c57e9d67a6cf-kube-api-access-srt9w\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.205579 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.205593 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f9b28632-689c-484b-91b4-c57e9d67a6cf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.214587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.307763 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.307827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.307857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdpzl\" (UniqueName: \"kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.307942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data\") pod 
\"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.308024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.308064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.308121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.308159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.409789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.409895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: 
\"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdpzl\" (UniqueName: \"kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.410906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.413484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.413763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.414355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.415636 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.415971 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.417383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.427292 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.430564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdpzl\" (UniqueName: \"kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl\") pod \"keystone-5784c8bdbd-lvrsx\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.492953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerStarted","Data":"f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66"} Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.495186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6f5q" event={"ID":"f9b28632-689c-484b-91b4-c57e9d67a6cf","Type":"ContainerDied","Data":"0ce001e92b708434d80728c02b725f83c983d260bf5619ca96dd8e7cf98ad28a"} Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.495245 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ce001e92b708434d80728c02b725f83c983d260bf5619ca96dd8e7cf98ad28a" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.495422 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x6f5q" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.532462 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.737263 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"] Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.739213 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.743428 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-59j9g"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.743627 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.743780 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.773277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"]
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.812325 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"]
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.817628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.817685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.817728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.817751 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8gjw\" (UniqueName: \"kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.817780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.826118 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.832273 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.907668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"]
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzhv\" (UniqueName: \"kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.920993 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.921067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8gjw\" (UniqueName: \"kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.921148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.921238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.921739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.923569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"]
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.925217 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.934341 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.935792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.936659 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.960162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8gjw\" (UniqueName: \"kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw\") pod \"barbican-worker-6b8cf6c56c-s8qz2\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " pod="openstack/barbican-worker-6b8cf6c56c-s8qz2"
Nov 29 01:33:19 crc kubenswrapper[4749]: I1129 01:33:19.965092 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"]
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:19.995586 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-546b495778-5rd58"]
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:19.997166 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546b495778-5rd58"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.002560 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.022627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.023169 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzhv\" (UniqueName: \"kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.023671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.023769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.023877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.024017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.024082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.024169 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.024264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.024404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.026591 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fszm7\" (UniqueName: \"kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.027321 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546b495778-5rd58"]
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.029009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.035319 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.040900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
Nov 29 01:33:20 crc kubenswrapper[4749]: E1129 01:33:20.044616 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b28632_689c_484b_91b4_c57e9d67a6cf.slice/crio-0ce001e92b708434d80728c02b725f83c983d260bf5619ca96dd8e7cf98ad28a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b28632_689c_484b_91b4_c57e9d67a6cf.slice\": RecentStats: unable to find data in memory cache]"
Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.052653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz"
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.067386 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzhv\" (UniqueName: \"kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv\") pod \"barbican-keystone-listener-56469f8b8-ckfjz\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") " pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zcrw\" (UniqueName: \"kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc 
kubenswrapper[4749]: I1129 01:33:20.129826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.129927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fszm7\" (UniqueName: \"kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.133332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.133917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.135580 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.136071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.136666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.141260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.149312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fszm7\" (UniqueName: \"kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7\") pod \"dnsmasq-dns-85ff748b95-sv7q6\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.162492 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.202944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.231430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.231517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.231555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zcrw\" (UniqueName: \"kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.231576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: 
I1129 01:33:20.231637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.266625 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.273802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.286610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zcrw\" (UniqueName: \"kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.286860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.288845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.289290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom\") pod \"barbican-api-546b495778-5rd58\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.305023 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.548008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5784c8bdbd-lvrsx" event={"ID":"28df216a-4f1e-449f-aaf6-45fd12929ad8","Type":"ContainerStarted","Data":"06b139fb24804461bcd182a769a5966891a8c20fa02074406e9d9c9291d59711"} Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.575393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerStarted","Data":"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d"} Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.595021 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"] Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.615335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerStarted","Data":"e944097794538d3c014641110ebd81f38c1dcad22ea2f4aa4c69b522c4b836ee"} Nov 29 01:33:20 crc kubenswrapper[4749]: I1129 01:33:20.878248 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"] Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.103753 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c4a57d2-c6ae-4669-9d0e-0283eebe5923" path="/var/lib/kubelet/pods/9c4a57d2-c6ae-4669-9d0e-0283eebe5923/volumes" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.235602 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546b495778-5rd58"] Nov 29 01:33:21 crc kubenswrapper[4749]: W1129 01:33:21.269880 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc02fc7_2dd5_44ea_9dce_75b4f1009fe5.slice/crio-623531fb3e2b980d32733388fd650c87dc04a8cf9aa5a6100b6d0ed40e63631c WatchSource:0}: Error finding container 623531fb3e2b980d32733388fd650c87dc04a8cf9aa5a6100b6d0ed40e63631c: Status 404 returned error can't find the container with id 623531fb3e2b980d32733388fd650c87dc04a8cf9aa5a6100b6d0ed40e63631c Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.270854 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"] Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.647558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerStarted","Data":"e2aab48732b202c3e72e85886215cd2e441185bbc74a2b999dc3bff8161959f4"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.647769 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.648081 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.650567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5784c8bdbd-lvrsx" event={"ID":"28df216a-4f1e-449f-aaf6-45fd12929ad8","Type":"ContainerStarted","Data":"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.651066 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.653065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerStarted","Data":"06b0e720ffdf3ec5411ab49d6dd1866ab6e45dffdb43a288497d385ec8e175b5"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.655037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerStarted","Data":"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.655362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.658560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerStarted","Data":"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.659891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerStarted","Data":"9f3b91d6584f697e324f451409378a786e169b21cd99c1d50e0345989aeddf8c"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.664215 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerStarted","Data":"0d5a589cfcaf2f0a639e6ed588c4e37b1712179544ef1a7769bab8a1ab5cd856"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.664261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerStarted","Data":"5f0b0e68a52651d0d877ab8d1537036912f0fb9f8b9fee86fe5902b77a52b67f"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.669938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerStarted","Data":"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.670105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerStarted","Data":"623531fb3e2b980d32733388fd650c87dc04a8cf9aa5a6100b6d0ed40e63631c"} Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.680604 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78bc78f9d8-g85sc" podStartSLOduration=11.680583059 podStartE2EDuration="11.680583059s" podCreationTimestamp="2025-11-29 01:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:21.665069222 +0000 UTC m=+1344.837219079" watchObservedRunningTime="2025-11-29 01:33:21.680583059 +0000 UTC m=+1344.852732916" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.694524 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-75b4fcb8ff-s4n9j" podStartSLOduration=12.694181369 
podStartE2EDuration="12.694181369s" podCreationTimestamp="2025-11-29 01:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:21.686451266 +0000 UTC m=+1344.858601123" watchObservedRunningTime="2025-11-29 01:33:21.694181369 +0000 UTC m=+1344.866331226" Nov 29 01:33:21 crc kubenswrapper[4749]: I1129 01:33:21.742731 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5784c8bdbd-lvrsx" podStartSLOduration=2.74271154 podStartE2EDuration="2.74271154s" podCreationTimestamp="2025-11-29 01:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:21.724966557 +0000 UTC m=+1344.897116424" watchObservedRunningTime="2025-11-29 01:33:21.74271154 +0000 UTC m=+1344.914861397" Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.697791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rj68d" event={"ID":"2ab7557b-b040-450a-b0ad-437720fab3a2","Type":"ContainerStarted","Data":"89b878f4648a1b7a255f4ce926f9a55d53da0950d5f663bde6352f702c375d11"} Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.703864 4749 generic.go:334] "Generic (PLEG): container finished" podID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerID="0d5a589cfcaf2f0a639e6ed588c4e37b1712179544ef1a7769bab8a1ab5cd856" exitCode=0 Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.703966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerDied","Data":"0d5a589cfcaf2f0a639e6ed588c4e37b1712179544ef1a7769bab8a1ab5cd856"} Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.704002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerStarted","Data":"e1483d26072704b5d06655f4f2dd2df7fba170852801a8c9219385a870ef2482"} Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.704231 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.706539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerStarted","Data":"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53"} Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.720871 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rj68d" podStartSLOduration=3.242465385 podStartE2EDuration="41.720847648s" podCreationTimestamp="2025-11-29 01:32:41 +0000 UTC" firstStartedPulling="2025-11-29 01:32:42.332669549 +0000 UTC m=+1305.504819406" lastFinishedPulling="2025-11-29 01:33:20.811051812 +0000 UTC m=+1343.983201669" observedRunningTime="2025-11-29 01:33:22.715919955 +0000 UTC m=+1345.888069822" watchObservedRunningTime="2025-11-29 01:33:22.720847648 +0000 UTC m=+1345.892997505" Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.745963 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" podStartSLOduration=3.745944564 podStartE2EDuration="3.745944564s" podCreationTimestamp="2025-11-29 01:33:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:22.740115159 +0000 UTC m=+1345.912265026" watchObservedRunningTime="2025-11-29 01:33:22.745944564 +0000 UTC m=+1345.918094421" Nov 29 01:33:22 crc kubenswrapper[4749]: I1129 01:33:22.781171 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-546b495778-5rd58" podStartSLOduration=3.781147763 podStartE2EDuration="3.781147763s" podCreationTimestamp="2025-11-29 01:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:22.759531223 +0000 UTC m=+1345.931681080" watchObservedRunningTime="2025-11-29 01:33:22.781147763 +0000 UTC m=+1345.953297620" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.693428 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.707299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.714326 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.714550 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.721800 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.773858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerStarted","Data":"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e"} Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.774262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerStarted","Data":"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09"} Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.786928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerStarted","Data":"ac6b10cbf89933157f87ce0832e498f620cc616f9e2d34fbbac8faa2f0a1cdbd"} Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.787315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerStarted","Data":"23c1f9a9695d62882d05b68f2a362774fd211e09dcdb543ab2bb762d27b58e29"} Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.788235 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.788432 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.849396 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" 
podStartSLOduration=2.778115524 podStartE2EDuration="4.849377811s" podCreationTimestamp="2025-11-29 01:33:19 +0000 UTC" firstStartedPulling="2025-11-29 01:33:20.881688605 +0000 UTC m=+1344.053838452" lastFinishedPulling="2025-11-29 01:33:22.952950882 +0000 UTC m=+1346.125100739" observedRunningTime="2025-11-29 01:33:23.840970641 +0000 UTC m=+1347.013120498" watchObservedRunningTime="2025-11-29 01:33:23.849377811 +0000 UTC m=+1347.021527668" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.850153 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" podStartSLOduration=2.520929383 podStartE2EDuration="4.85014671s" podCreationTimestamp="2025-11-29 01:33:19 +0000 UTC" firstStartedPulling="2025-11-29 01:33:20.619755376 +0000 UTC m=+1343.791905223" lastFinishedPulling="2025-11-29 01:33:22.948972693 +0000 UTC m=+1346.121122550" observedRunningTime="2025-11-29 01:33:23.811583298 +0000 UTC m=+1346.983733155" watchObservedRunningTime="2025-11-29 01:33:23.85014671 +0000 UTC m=+1347.022296567" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.873762 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.873834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.873896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.873924 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.874006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.874159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 
crc kubenswrapper[4749]: I1129 01:33:23.874183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.975336 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.976188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.976318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.976564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.976710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.977504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.977757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.977806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 
01:33:23.986712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.988045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.994839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.994906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.995600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:23 crc kubenswrapper[4749]: I1129 01:33:23.998399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") pod \"barbican-api-744d76c7bb-6xh5q\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:24 crc kubenswrapper[4749]: I1129 01:33:24.062076 4749 util.go:30] "No sandbox for pod can be found. 
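[Editor's annotation, not part of the captured log: the projected "kube-api-access-w72h6" volume mounted above is the standard mechanism by which the pod's ServiceAccount token, namespace, and cluster CA certificate are surfaced inside the container at a well-known path. A minimal sketch of how a process in the barbican-api container would read them:]

    from pathlib import Path

    # Standard mount point of a projected kube-api-access-* volume.
    SA = Path('/var/run/secrets/kubernetes.io/serviceaccount')

    def service_account_identity():
        return {
            'token': (SA / 'token').read_text().strip(),
            'namespace': (SA / 'namespace').read_text().strip(),  # "openstack" here
            'ca_cert': SA / 'ca.crt',
        }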
Need to start a new one" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:24 crc kubenswrapper[4749]: I1129 01:33:24.564008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:33:24 crc kubenswrapper[4749]: I1129 01:33:24.796252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerStarted","Data":"d7b0c26bec26e466ebc66b267c17b31bec562429e5b71d626ce13e2764f2a0c0"} Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.374707 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.375010 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.375052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.375772 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.375848 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97" gracePeriod=600 Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.809739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerStarted","Data":"02e31a093f3745f70d2f114e3071dcc2ff626d6fa46cc35da2f370cfb31b8cce"} Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.810142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerStarted","Data":"405921777b6096485df779e9382966efbbcbdbe565e378910a31ae009c151eab"} Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.810171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.815312 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97" exitCode=0 Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.815862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97"} Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.816041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"} Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.816112 4749 scope.go:117] "RemoveContainer" containerID="6e6c7dec3b1649e653ef737530df27b983a2221104d91371e3560585b54c93a8" Nov 29 01:33:25 crc kubenswrapper[4749]: I1129 01:33:25.835036 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-744d76c7bb-6xh5q" podStartSLOduration=2.835016592 podStartE2EDuration="2.835016592s" podCreationTimestamp="2025-11-29 01:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:25.82691183 +0000 UTC m=+1348.999061687" watchObservedRunningTime="2025-11-29 01:33:25.835016592 +0000 UTC m=+1349.007166449" Nov 29 01:33:26 crc kubenswrapper[4749]: I1129 01:33:26.838431 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:27 crc kubenswrapper[4749]: I1129 01:33:27.848134 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ab7557b-b040-450a-b0ad-437720fab3a2" containerID="89b878f4648a1b7a255f4ce926f9a55d53da0950d5f663bde6352f702c375d11" exitCode=0 Nov 29 01:33:27 crc kubenswrapper[4749]: I1129 01:33:27.848998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rj68d" event={"ID":"2ab7557b-b040-450a-b0ad-437720fab3a2","Type":"ContainerDied","Data":"89b878f4648a1b7a255f4ce926f9a55d53da0950d5f663bde6352f702c375d11"} Nov 29 01:33:30 crc kubenswrapper[4749]: I1129 01:33:30.276589 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:30 crc kubenswrapper[4749]: I1129 01:33:30.354089 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:30 crc kubenswrapper[4749]: I1129 01:33:30.354384 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="dnsmasq-dns" containerID="cri-o://3beff1d10e953e55a3b106b089faaec16567d6ed10bb480994464b9f19994b2d" gracePeriod=10 Nov 29 01:33:31 crc kubenswrapper[4749]: I1129 01:33:31.392254 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:31 crc kubenswrapper[4749]: I1129 01:33:31.887235 4749 generic.go:334] "Generic (PLEG): container finished" podID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerID="3beff1d10e953e55a3b106b089faaec16567d6ed10bb480994464b9f19994b2d" exitCode=0 Nov 29 01:33:31 crc kubenswrapper[4749]: I1129 01:33:31.887283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" event={"ID":"7eee8ed5-a3fb-4460-8b03-337b30a8dd61","Type":"ContainerDied","Data":"3beff1d10e953e55a3b106b089faaec16567d6ed10bb480994464b9f19994b2d"} Nov 29 01:33:32 crc kubenswrapper[4749]: I1129 
01:33:32.021181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:32 crc kubenswrapper[4749]: I1129 01:33:32.042711 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused" Nov 29 01:33:32 crc kubenswrapper[4749]: I1129 01:33:32.150240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:32 crc kubenswrapper[4749]: I1129 01:33:32.962374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.053386 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-546b495778-5rd58"] Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.101531 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rj68d" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.139442 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d64dc\" (UniqueName: \"kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185782 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185883 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185929 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqwjv\" (UniqueName: \"kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.185985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.186011 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id\") pod \"2ab7557b-b040-450a-b0ad-437720fab3a2\" (UID: \"2ab7557b-b040-450a-b0ad-437720fab3a2\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.186031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config\") pod \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\" (UID: \"7eee8ed5-a3fb-4460-8b03-337b30a8dd61\") " Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.197414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.204631 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv" (OuterVolumeSpecName: "kube-api-access-nqwjv") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). InnerVolumeSpecName "kube-api-access-nqwjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.227428 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts" (OuterVolumeSpecName: "scripts") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.227566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc" (OuterVolumeSpecName: "kube-api-access-d64dc") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "kube-api-access-d64dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.227591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.290439 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqwjv\" (UniqueName: \"kubernetes.io/projected/2ab7557b-b040-450a-b0ad-437720fab3a2-kube-api-access-nqwjv\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.290474 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ab7557b-b040-450a-b0ad-437720fab3a2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.290485 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d64dc\" (UniqueName: \"kubernetes.io/projected/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-kube-api-access-d64dc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.290493 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.290502 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.294604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.301387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config" (OuterVolumeSpecName: "config") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.312326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data" (OuterVolumeSpecName: "config-data") pod "2ab7557b-b040-450a-b0ad-437720fab3a2" (UID: "2ab7557b-b040-450a-b0ad-437720fab3a2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.317955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.333729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.335960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.379698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7eee8ed5-a3fb-4460-8b03-337b30a8dd61" (UID: "7eee8ed5-a3fb-4460-8b03-337b30a8dd61"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.397293 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398402 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398426 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398439 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398455 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7eee8ed5-a3fb-4460-8b03-337b30a8dd61-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398466 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.398474 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ab7557b-b040-450a-b0ad-437720fab3a2-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:33 crc kubenswrapper[4749]: E1129 01:33:33.538765 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.915472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" event={"ID":"7eee8ed5-a3fb-4460-8b03-337b30a8dd61","Type":"ContainerDied","Data":"6a437d33013501e2accd706f0f94d6d9dd39be2c618d10f039634c4c4f469544"} Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.915539 4749 scope.go:117] "RemoveContainer" containerID="3beff1d10e953e55a3b106b089faaec16567d6ed10bb480994464b9f19994b2d" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.915535 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bzvtk" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.918053 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rj68d" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.918077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rj68d" event={"ID":"2ab7557b-b040-450a-b0ad-437720fab3a2","Type":"ContainerDied","Data":"4af4b725638963a5e22f9788f575f26d953d1ee9b6b5e019c867addaa01bef16"} Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.918170 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af4b725638963a5e22f9788f575f26d953d1ee9b6b5e019c867addaa01bef16" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerStarted","Data":"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f"} Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928442 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-546b495778-5rd58" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api-log" containerID="cri-o://30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e" gracePeriod=30 Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928522 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="ceilometer-notification-agent" containerID="cri-o://9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601" gracePeriod=30 Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928533 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-546b495778-5rd58" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api" containerID="cri-o://5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53" gracePeriod=30 Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928638 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="proxy-httpd" containerID="cri-o://d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f" gracePeriod=30 Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.928692 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="sg-core" containerID="cri-o://12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53" gracePeriod=30 Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.937450 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-546b495778-5rd58" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": EOF" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.957727 4749 scope.go:117] "RemoveContainer" containerID="eca228af450820e842a5822f41fa9aa8f60e1ad79808b3b12e63ff431558e7c6" Nov 29 01:33:33 crc kubenswrapper[4749]: I1129 01:33:33.992058 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.002434 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bzvtk"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.302506 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:34 crc kubenswrapper[4749]: E1129 01:33:34.302973 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="init" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.302991 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="init" Nov 29 01:33:34 crc kubenswrapper[4749]: E1129 01:33:34.303019 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" containerName="cinder-db-sync" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.303027 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" containerName="cinder-db-sync" Nov 29 01:33:34 crc kubenswrapper[4749]: E1129 01:33:34.303041 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="dnsmasq-dns" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.303048 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="dnsmasq-dns" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.303261 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" containerName="cinder-db-sync" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.303288 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" containerName="dnsmasq-dns" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.304301 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.315916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316032 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4sf\" (UniqueName: \"kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.316877 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.317005 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.326336 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bkkgq" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.332250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.363995 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.365442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4sf\" (UniqueName: 
\"kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptw7\" (UniqueName: \"kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.417986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.418034 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.418067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.418092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.418108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.419223 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.426380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.434781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.435146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.439767 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.442013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.444091 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4sf\" (UniqueName: \"kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf\") pod \"cinder-scheduler-0\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: 
\"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.519506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptw7\" (UniqueName: \"kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.520517 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.522248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.522828 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.523585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.524028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.543388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptw7\" (UniqueName: \"kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7\") pod \"dnsmasq-dns-5c9776ccc5-q4zcf\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.557419 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.559321 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.561218 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.572037 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.621429 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95g4w\" (UniqueName: \"kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.728634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.812719 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.831606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95g4w\" (UniqueName: \"kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.833678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.833758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.841120 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " 
pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.841293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.845944 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.846063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.873876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95g4w\" (UniqueName: \"kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w\") pod \"cinder-api-0\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " pod="openstack/cinder-api-0" Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.999010 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerID="d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f" exitCode=0 Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.999048 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerID="12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53" exitCode=2 Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.999089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerDied","Data":"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f"} Nov 29 01:33:34 crc kubenswrapper[4749]: I1129 01:33:34.999136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerDied","Data":"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53"} Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.002885 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.029573 4749 generic.go:334] "Generic (PLEG): container finished" podID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerID="30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e" exitCode=143 Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.029617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerDied","Data":"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e"} Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.090721 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eee8ed5-a3fb-4460-8b03-337b30a8dd61" path="/var/lib/kubelet/pods/7eee8ed5-a3fb-4460-8b03-337b30a8dd61/volumes" Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.168436 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.515844 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:35 crc kubenswrapper[4749]: W1129 01:33:35.530387 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf119add_6e90_483e_ae23_4244562fda61.slice/crio-4be056d7b0b7443a3ba1547c617aa348761f4fe718f1a62dee2c89b812cc296d WatchSource:0}: Error finding container 4be056d7b0b7443a3ba1547c617aa348761f4fe718f1a62dee2c89b812cc296d: Status 404 returned error can't find the container with id 4be056d7b0b7443a3ba1547c617aa348761f4fe718f1a62dee2c89b812cc296d Nov 29 01:33:35 crc kubenswrapper[4749]: I1129 01:33:35.637069 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"] Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.013709 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.042663 4749 generic.go:334] "Generic (PLEG): container finished" podID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerID="80245bf1375be77093f163d2d36a22b8e4b81e8417dc9026378a606ed08f2aa8" exitCode=0 Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.042965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" event={"ID":"a69dc8b1-6d28-4872-97d2-471104e468fe","Type":"ContainerDied","Data":"80245bf1375be77093f163d2d36a22b8e4b81e8417dc9026378a606ed08f2aa8"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.042991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" event={"ID":"a69dc8b1-6d28-4872-97d2-471104e468fe","Type":"ContainerStarted","Data":"371d0bdc441b70ecb5a37239f0f99e3a5330ac29913c720f62e1879093bdc7e0"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.051339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerStarted","Data":"4be056d7b0b7443a3ba1547c617aa348761f4fe718f1a62dee2c89b812cc296d"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.053347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerStarted","Data":"0eff4e655a00dd52c78698d9c211611f8a703de29c32131baa460e14a0607551"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.057766 4749 generic.go:334] "Generic (PLEG): container finished" podID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerID="9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601" exitCode=0 Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.057798 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerDied","Data":"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.057816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf","Type":"ContainerDied","Data":"e4e07575f031401e6ef44dda63a148367d05c06bc1edad2f9b8db12771f61b40"} Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.057832 4749 scope.go:117] "RemoveContainer" containerID="d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.057948 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.115878 4749 scope.go:117] "RemoveContainer" containerID="12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.158398 4749 scope.go:117] "RemoveContainer" containerID="9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185859 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t4h4\" (UniqueName: \"kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.185945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts\") pod \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\" (UID: \"21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf\") " Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.187285 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.192292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.199432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4" (OuterVolumeSpecName: "kube-api-access-9t4h4") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "kube-api-access-9t4h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.201547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts" (OuterVolumeSpecName: "scripts") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.201578 4749 scope.go:117] "RemoveContainer" containerID="d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f" Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.202819 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f\": container with ID starting with d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f not found: ID does not exist" containerID="d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.202863 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f"} err="failed to get container status \"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f\": rpc error: code = NotFound desc = could not find container \"d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f\": container with ID starting with d3e0be21be68c4e9761cce180c62d0fa0b4c894dc444e17c7cb9c76b938bdb2f not found: ID does not exist" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.202895 4749 scope.go:117] "RemoveContainer" containerID="12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53" Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.203410 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53\": container with ID starting with 12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53 not found: ID does not exist" containerID="12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.203459 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53"} err="failed to get container status 
\"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53\": rpc error: code = NotFound desc = could not find container \"12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53\": container with ID starting with 12cc87ce2325ad1f25697595b5bfabfbb938a70f033319512b3a6c378e07fc53 not found: ID does not exist" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.203488 4749 scope.go:117] "RemoveContainer" containerID="9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601" Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.204788 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601\": container with ID starting with 9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601 not found: ID does not exist" containerID="9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.204819 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601"} err="failed to get container status \"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601\": rpc error: code = NotFound desc = could not find container \"9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601\": container with ID starting with 9eda7d070a9b49d309641160005a5b457d44d9a2740233441bece1eadac47601 not found: ID does not exist" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.242795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.288931 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.288960 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t4h4\" (UniqueName: \"kubernetes.io/projected/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-kube-api-access-9t4h4\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.288970 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.288980 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.289007 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.296100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.309032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data" (OuterVolumeSpecName: "config-data") pod "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" (UID: "21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.404967 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.404997 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.450460 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.455317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.495426 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.495841 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="proxy-httpd" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.495859 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="proxy-httpd" Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.495890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="sg-core" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.495900 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="sg-core" Nov 29 01:33:36 crc kubenswrapper[4749]: E1129 01:33:36.495909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="ceilometer-notification-agent" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.495915 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="ceilometer-notification-agent" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.496094 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="proxy-httpd" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.496121 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="sg-core" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.496136 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" containerName="ceilometer-notification-agent" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.497789 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.504777 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg87q\" (UniqueName: \"kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.507860 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.508096 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.612739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.612812 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.612943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.612964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.612988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg87q\" (UniqueName: \"kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.613051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.613069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.617906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.618385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.619269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.622146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.630599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.638522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg87q\" (UniqueName: \"kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.640541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") " pod="openstack/ceilometer-0" Nov 29 01:33:36 crc kubenswrapper[4749]: I1129 01:33:36.838924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.120037 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf" path="/var/lib/kubelet/pods/21c7fb80-fb03-4e0a-85f3-c5dc56d3ccdf/volumes" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.121918 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.121940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerStarted","Data":"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3"} Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.121959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerStarted","Data":"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073"} Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.121971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerStarted","Data":"ae0ed88a1c6160707e84533438149cec007bc4dee7f1326ad7db99e376e12201"} Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.131552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" event={"ID":"a69dc8b1-6d28-4872-97d2-471104e468fe","Type":"ContainerStarted","Data":"e5bed771bb234e6165f9372ee25f274f50c7368489100dfd707cbe3e43b234ea"} Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.132087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.176556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.183405 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.218369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.218349472 podStartE2EDuration="3.218349472s" podCreationTimestamp="2025-11-29 01:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:37.215793678 +0000 UTC m=+1360.387943535" watchObservedRunningTime="2025-11-29 01:33:37.218349472 +0000 UTC m=+1360.390499329" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.255572 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" podStartSLOduration=3.25554904 podStartE2EDuration="3.25554904s" podCreationTimestamp="2025-11-29 01:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:37.244641018 +0000 UTC m=+1360.416790885" watchObservedRunningTime="2025-11-29 01:33:37.25554904 +0000 UTC m=+1360.427698897" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.378459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.463430 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-546b495778-5rd58" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:56756->10.217.0.154:9311: read: connection reset by peer" Nov 29 01:33:37 crc kubenswrapper[4749]: I1129 01:33:37.463691 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-546b495778-5rd58" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.154:9311/healthcheck\": read tcp 10.217.0.2:56754->10.217.0.154:9311: read: connection reset by peer" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.025479 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.146653 4749 generic.go:334] "Generic (PLEG): container finished" podID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerID="5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53" exitCode=0 Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.146792 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-546b495778-5rd58" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.147499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerDied","Data":"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53"} Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.147524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546b495778-5rd58" event={"ID":"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5","Type":"ContainerDied","Data":"623531fb3e2b980d32733388fd650c87dc04a8cf9aa5a6100b6d0ed40e63631c"} Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.147550 4749 scope.go:117] "RemoveContainer" containerID="5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.152303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerStarted","Data":"67b7b0e6a86da6f669181bbb9bcede7fecb0bbe4690fbdc042440d3d829e937f"} Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.155801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerStarted","Data":"ca4083126ea2ef9365a012ae85da1e24a4bd62db04b2266ea14beab6316e936c"} Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.173262 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.1919838130000002 podStartE2EDuration="4.17324767s" podCreationTimestamp="2025-11-29 01:33:34 +0000 UTC" firstStartedPulling="2025-11-29 01:33:35.187515853 +0000 UTC m=+1358.359665710" lastFinishedPulling="2025-11-29 01:33:36.16877971 +0000 UTC m=+1359.340929567" observedRunningTime="2025-11-29 01:33:38.172911082 +0000 UTC m=+1361.345060939" watchObservedRunningTime="2025-11-29 01:33:38.17324767 +0000 UTC m=+1361.345397527" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.175804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zcrw\" (UniqueName: \"kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw\") pod \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.175851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom\") pod \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.175928 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data\") pod \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.176015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle\") pod \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " Nov 29 01:33:38 crc 
kubenswrapper[4749]: I1129 01:33:38.176082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs\") pod \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\" (UID: \"bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5\") " Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.177036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs" (OuterVolumeSpecName: "logs") pod "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" (UID: "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.180499 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw" (OuterVolumeSpecName: "kube-api-access-9zcrw") pod "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" (UID: "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5"). InnerVolumeSpecName "kube-api-access-9zcrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.183744 4749 scope.go:117] "RemoveContainer" containerID="30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.183779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" (UID: "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.213499 4749 scope.go:117] "RemoveContainer" containerID="5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53" Nov 29 01:33:38 crc kubenswrapper[4749]: E1129 01:33:38.213897 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53\": container with ID starting with 5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53 not found: ID does not exist" containerID="5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.213940 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53"} err="failed to get container status \"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53\": rpc error: code = NotFound desc = could not find container \"5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53\": container with ID starting with 5a8a4e35fc08d66c5584e9c5c1888d70c83aa6b3afae9d0617ae73033e3e3f53 not found: ID does not exist" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.213960 4749 scope.go:117] "RemoveContainer" containerID="30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e" Nov 29 01:33:38 crc kubenswrapper[4749]: E1129 01:33:38.214149 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e\": container with ID starting with 
30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e not found: ID does not exist" containerID="30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.214164 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e"} err="failed to get container status \"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e\": rpc error: code = NotFound desc = could not find container \"30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e\": container with ID starting with 30af0c7d6d87b0ececb483fb58265ec3b05eb200ddcb5bbfed54332edf13ca0e not found: ID does not exist" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.232188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" (UID: "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.261080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data" (OuterVolumeSpecName: "config-data") pod "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" (UID: "bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.278386 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zcrw\" (UniqueName: \"kubernetes.io/projected/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-kube-api-access-9zcrw\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.278438 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.278451 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.278460 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.278468 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.486173 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-546b495778-5rd58"] Nov 29 01:33:38 crc kubenswrapper[4749]: I1129 01:33:38.496859 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-546b495778-5rd58"] Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.088089 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" 
path="/var/lib/kubelet/pods/bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5/volumes" Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.166976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerStarted","Data":"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"} Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.167032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerStarted","Data":"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"} Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.173376 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api-log" containerID="cri-o://321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" gracePeriod=30 Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.173996 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api" containerID="cri-o://028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" gracePeriod=30 Nov 29 01:33:39 crc kubenswrapper[4749]: I1129 01:33:39.870836 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.055443 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.134797 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.135050 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79cf68bf7b-r88sb" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-api" containerID="cri-o://92274068016fa71ba48799703446fbcf030ea448fcc91ca833ddc199876bd97f" gracePeriod=30 Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.135178 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79cf68bf7b-r88sb" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-httpd" containerID="cri-o://378ad2fd5a8829a92d3577ec6ec91a92b3e091baf09191f73f6dd468ff370b3c" gracePeriod=30 Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.142499 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218628 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf119add-6e90-483e-ae23-4244562fda61" containerID="028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" exitCode=0 Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218662 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf119add-6e90-483e-ae23-4244562fda61" containerID="321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" exitCode=143 Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218779 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerDied","Data":"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3"} Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerDied","Data":"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073"} Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bf119add-6e90-483e-ae23-4244562fda61","Type":"ContainerDied","Data":"4be056d7b0b7443a3ba1547c617aa348761f4fe718f1a62dee2c89b812cc296d"} Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.218928 4749 scope.go:117] "RemoveContainer" containerID="028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.246638 4749 scope.go:117] "RemoveContainer" containerID="321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.273721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.273796 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.273860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.273893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.274023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95g4w\" (UniqueName: \"kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.274054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.274073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs\") pod \"bf119add-6e90-483e-ae23-4244562fda61\" (UID: \"bf119add-6e90-483e-ae23-4244562fda61\") " Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.276071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.276969 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs" (OuterVolumeSpecName: "logs") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.277357 4749 scope.go:117] "RemoveContainer" containerID="028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.281677 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3\": container with ID starting with 028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3 not found: ID does not exist" containerID="028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.281725 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3"} err="failed to get container status \"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3\": rpc error: code = NotFound desc = could not find container \"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3\": container with ID starting with 028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3 not found: ID does not exist" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.281780 4749 scope.go:117] "RemoveContainer" containerID="321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.281788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w" (OuterVolumeSpecName: "kube-api-access-95g4w") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "kube-api-access-95g4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.281898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.282241 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts" (OuterVolumeSpecName: "scripts") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.283437 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073\": container with ID starting with 321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073 not found: ID does not exist" containerID="321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.283568 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073"} err="failed to get container status \"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073\": rpc error: code = NotFound desc = could not find container \"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073\": container with ID starting with 321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073 not found: ID does not exist" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.283689 4749 scope.go:117] "RemoveContainer" containerID="028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.284272 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3"} err="failed to get container status \"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3\": rpc error: code = NotFound desc = could not find container \"028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3\": container with ID starting with 028d0c08a58a86fa00f9e08347e32c5efea83cf3e28e9ff232501ecc819fbec3 not found: ID does not exist" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.284334 4749 scope.go:117] "RemoveContainer" containerID="321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.284668 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073"} err="failed to get container status \"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073\": rpc error: code = NotFound desc = could not find container \"321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073\": container with ID starting with 321fd462b99c1da44007e352d7607920e2eb3adeb898800ab0f7efe6e059c073 not found: ID does not exist" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.317557 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.356430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data" (OuterVolumeSpecName: "config-data") pod "bf119add-6e90-483e-ae23-4244562fda61" (UID: "bf119add-6e90-483e-ae23-4244562fda61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376616 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf119add-6e90-483e-ae23-4244562fda61-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376647 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376658 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376667 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376675 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf119add-6e90-483e-ae23-4244562fda61-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376683 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95g4w\" (UniqueName: \"kubernetes.io/projected/bf119add-6e90-483e-ae23-4244562fda61-kube-api-access-95g4w\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.376691 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf119add-6e90-483e-ae23-4244562fda61-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.586357 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.606623 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.615048 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.615467 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.615541 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.615597 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.615645 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" 
containerName="barbican-api" Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.615704 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.615757 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: E1129 01:33:40.615817 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.615868 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.616083 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.616155 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.616226 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc02fc7-2dd5-44ea-9dce-75b4f1009fe5" containerName="barbican-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.616283 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf119add-6e90-483e-ae23-4244562fda61" containerName="cinder-api-log" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.617385 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.622129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.622481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.622620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.632607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.783677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.783761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.783850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25qb\" (UniqueName: \"kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " 
pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.783963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.784018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.784082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.784125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.784189 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.784251 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884775 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884809 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k25qb\" (UniqueName: \"kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.884973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.885004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.885083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.885718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.889931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.891110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.891727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.892369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.892526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.900634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.910631 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25qb\" (UniqueName: \"kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb\") pod \"cinder-api-0\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " pod="openstack/cinder-api-0" Nov 29 01:33:40 crc kubenswrapper[4749]: I1129 01:33:40.938842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.095081 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf119add-6e90-483e-ae23-4244562fda61" path="/var/lib/kubelet/pods/bf119add-6e90-483e-ae23-4244562fda61/volumes" Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.231236 4749 generic.go:334] "Generic (PLEG): container finished" podID="436900b7-4647-44d8-8f0f-6e6077747800" containerID="378ad2fd5a8829a92d3577ec6ec91a92b3e091baf09191f73f6dd468ff370b3c" exitCode=0 Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.231304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerDied","Data":"378ad2fd5a8829a92d3577ec6ec91a92b3e091baf09191f73f6dd468ff370b3c"} Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.233567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerStarted","Data":"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"} Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.403471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.873643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:41 crc kubenswrapper[4749]: I1129 01:33:41.873958 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:33:42 crc kubenswrapper[4749]: I1129 01:33:42.248081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerStarted","Data":"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"} Nov 29 01:33:42 crc kubenswrapper[4749]: I1129 01:33:42.248308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:33:42 crc kubenswrapper[4749]: I1129 01:33:42.257588 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerStarted","Data":"eced19490e815516b429f396da09f8236d99a0db8fbad5ff60fe95e770b29786"} Nov 29 01:33:42 crc kubenswrapper[4749]: I1129 01:33:42.257661 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerStarted","Data":"4967f27014785bf65a50fb6d24a7f2f85534c68587ab6453c12a25785023d064"} Nov 29 01:33:43 crc kubenswrapper[4749]: I1129 01:33:43.267659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerStarted","Data":"0957fdee810eb3eb8c31f26782750e24fed0ee07cdb3eb0ee23af55d2e009a5c"} Nov 29 01:33:43 crc kubenswrapper[4749]: I1129 01:33:43.300151 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.154383415 podStartE2EDuration="7.300136151s" podCreationTimestamp="2025-11-29 01:33:36 +0000 UTC" firstStartedPulling="2025-11-29 01:33:37.410786436 +0000 UTC m=+1360.582936293" lastFinishedPulling="2025-11-29 01:33:41.556539182 +0000 UTC m=+1364.728689029" observedRunningTime="2025-11-29 01:33:42.269087071 +0000 UTC m=+1365.441236948" watchObservedRunningTime="2025-11-29 01:33:43.300136151 +0000 UTC m=+1366.472286008" Nov 29 01:33:43 crc kubenswrapper[4749]: I1129 01:33:43.304866 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.304851959 podStartE2EDuration="3.304851959s" podCreationTimestamp="2025-11-29 01:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:43.294919051 +0000 UTC m=+1366.467068918" watchObservedRunningTime="2025-11-29 01:33:43.304851959 +0000 UTC m=+1366.477001816" Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.274770 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.815416 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.902581 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.936382 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"] Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.938736 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="dnsmasq-dns" containerID="cri-o://e1483d26072704b5d06655f4f2dd2df7fba170852801a8c9219385a870ef2482" gracePeriod=10 Nov 29 01:33:44 crc kubenswrapper[4749]: I1129 01:33:44.975842 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.320908 4749 generic.go:334] "Generic (PLEG): container finished" podID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerID="e1483d26072704b5d06655f4f2dd2df7fba170852801a8c9219385a870ef2482" exitCode=0 Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.321972 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerDied","Data":"e1483d26072704b5d06655f4f2dd2df7fba170852801a8c9219385a870ef2482"} Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.322368 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="cinder-scheduler" containerID="cri-o://ae0ed88a1c6160707e84533438149cec007bc4dee7f1326ad7db99e376e12201" gracePeriod=30 Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.322436 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="probe" containerID="cri-o://67b7b0e6a86da6f669181bbb9bcede7fecb0bbe4690fbdc042440d3d829e937f" gracePeriod=30 Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.481676 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb\") pod \"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0\") pod \"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fszm7\" (UniqueName: \"kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7\") pod \"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669596 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc\") pod \"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb\") pod \"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.669668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config\") pod 
\"fecf6c48-a0e6-425b-b071-d54cc01b6751\" (UID: \"fecf6c48-a0e6-425b-b071-d54cc01b6751\") " Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.677500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7" (OuterVolumeSpecName: "kube-api-access-fszm7") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "kube-api-access-fszm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.732579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.738841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.748597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.757258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config" (OuterVolumeSpecName: "config") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.762651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fecf6c48-a0e6-425b-b071-d54cc01b6751" (UID: "fecf6c48-a0e6-425b-b071-d54cc01b6751"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772864 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772909 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772924 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772936 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772947 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fecf6c48-a0e6-425b-b071-d54cc01b6751-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:45 crc kubenswrapper[4749]: I1129 01:33:45.772959 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fszm7\" (UniqueName: \"kubernetes.io/projected/fecf6c48-a0e6-425b-b071-d54cc01b6751-kube-api-access-fszm7\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.335718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" event={"ID":"fecf6c48-a0e6-425b-b071-d54cc01b6751","Type":"ContainerDied","Data":"5f0b0e68a52651d0d877ab8d1537036912f0fb9f8b9fee86fe5902b77a52b67f"} Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.335778 4749 scope.go:117] "RemoveContainer" containerID="e1483d26072704b5d06655f4f2dd2df7fba170852801a8c9219385a870ef2482" Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.338506 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.339016 4749 generic.go:334] "Generic (PLEG): container finished" podID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerID="67b7b0e6a86da6f669181bbb9bcede7fecb0bbe4690fbdc042440d3d829e937f" exitCode=0 Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.339077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerDied","Data":"67b7b0e6a86da6f669181bbb9bcede7fecb0bbe4690fbdc042440d3d829e937f"} Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.380177 4749 scope.go:117] "RemoveContainer" containerID="0d5a589cfcaf2f0a639e6ed588c4e37b1712179544ef1a7769bab8a1ab5cd856" Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.384755 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"] Nov 29 01:33:46 crc kubenswrapper[4749]: I1129 01:33:46.398284 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-sv7q6"] Nov 29 01:33:47 crc kubenswrapper[4749]: I1129 01:33:47.094086 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" path="/var/lib/kubelet/pods/fecf6c48-a0e6-425b-b071-d54cc01b6751/volumes" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.383958 4749 generic.go:334] "Generic (PLEG): container finished" podID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerID="ae0ed88a1c6160707e84533438149cec007bc4dee7f1326ad7db99e376e12201" exitCode=0 Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.384097 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerDied","Data":"ae0ed88a1c6160707e84533438149cec007bc4dee7f1326ad7db99e376e12201"} Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.715470 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.865586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.865771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.865847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.866022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x4sf\" (UniqueName: \"kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.866156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.866238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id\") pod \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\" (UID: \"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae\") " Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.866396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.867131 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.877330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf" (OuterVolumeSpecName: "kube-api-access-9x4sf") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "kube-api-access-9x4sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.894455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts" (OuterVolumeSpecName: "scripts") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.894809 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.956788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.968776 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.968810 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x4sf\" (UniqueName: \"kubernetes.io/projected/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-kube-api-access-9x4sf\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.968825 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:49 crc kubenswrapper[4749]: I1129 01:33:49.968837 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.053297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data" (OuterVolumeSpecName: "config-data") pod "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" (UID: "e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.070760 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.274687 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-sv7q6" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.405116 4749 generic.go:334] "Generic (PLEG): container finished" podID="436900b7-4647-44d8-8f0f-6e6077747800" containerID="92274068016fa71ba48799703446fbcf030ea448fcc91ca833ddc199876bd97f" exitCode=0 Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.405249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerDied","Data":"92274068016fa71ba48799703446fbcf030ea448fcc91ca833ddc199876bd97f"} Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.409982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae","Type":"ContainerDied","Data":"0eff4e655a00dd52c78698d9c211611f8a703de29c32131baa460e14a0607551"} Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.410025 4749 scope.go:117] "RemoveContainer" containerID="67b7b0e6a86da6f669181bbb9bcede7fecb0bbe4690fbdc042440d3d829e937f" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.410093 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.449801 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.455354 4749 scope.go:117] "RemoveContainer" containerID="ae0ed88a1c6160707e84533438149cec007bc4dee7f1326ad7db99e376e12201" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.456921 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.478814 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:50 crc kubenswrapper[4749]: E1129 01:33:50.479384 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="probe" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.479449 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="probe" Nov 29 01:33:50 crc kubenswrapper[4749]: E1129 01:33:50.479512 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="cinder-scheduler" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.479561 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="cinder-scheduler" Nov 29 01:33:50 crc kubenswrapper[4749]: E1129 01:33:50.479638 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="dnsmasq-dns" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.479687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="dnsmasq-dns" Nov 29 01:33:50 crc kubenswrapper[4749]: E1129 01:33:50.479737 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="init" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.479793 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="init" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.480009 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecf6c48-a0e6-425b-b071-d54cc01b6751" containerName="dnsmasq-dns" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.480073 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="probe" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.480138 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" containerName="cinder-scheduler" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.481103 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.484036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.497658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.583823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8j7m\" (UniqueName: \"kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.684903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.685186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.685836 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.685873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.685918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.685920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.686073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8j7m\" (UniqueName: \"kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.693816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.698369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.699798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.710390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.716713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8j7m\" (UniqueName: \"kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m\") pod \"cinder-scheduler-0\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 
crc kubenswrapper[4749]: I1129 01:33:50.798523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.846829 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.990952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle\") pod \"436900b7-4647-44d8-8f0f-6e6077747800\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.991248 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config\") pod \"436900b7-4647-44d8-8f0f-6e6077747800\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.991395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjts\" (UniqueName: \"kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts\") pod \"436900b7-4647-44d8-8f0f-6e6077747800\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.991440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config\") pod \"436900b7-4647-44d8-8f0f-6e6077747800\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " Nov 29 01:33:50 crc kubenswrapper[4749]: I1129 01:33:50.991489 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs\") pod \"436900b7-4647-44d8-8f0f-6e6077747800\" (UID: \"436900b7-4647-44d8-8f0f-6e6077747800\") " Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:50.999362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts" (OuterVolumeSpecName: "kube-api-access-nfjts") pod "436900b7-4647-44d8-8f0f-6e6077747800" (UID: "436900b7-4647-44d8-8f0f-6e6077747800"). InnerVolumeSpecName "kube-api-access-nfjts". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.002315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "436900b7-4647-44d8-8f0f-6e6077747800" (UID: "436900b7-4647-44d8-8f0f-6e6077747800"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.056801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config" (OuterVolumeSpecName: "config") pod "436900b7-4647-44d8-8f0f-6e6077747800" (UID: "436900b7-4647-44d8-8f0f-6e6077747800"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.068350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "436900b7-4647-44d8-8f0f-6e6077747800" (UID: "436900b7-4647-44d8-8f0f-6e6077747800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.075709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "436900b7-4647-44d8-8f0f-6e6077747800" (UID: "436900b7-4647-44d8-8f0f-6e6077747800"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.089271 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae" path="/var/lib/kubelet/pods/e20f40c0-d97f-4cf4-9eec-ca0472d0d3ae/volumes" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.093834 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.093877 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjts\" (UniqueName: \"kubernetes.io/projected/436900b7-4647-44d8-8f0f-6e6077747800-kube-api-access-nfjts\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.093892 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.093904 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.093916 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436900b7-4647-44d8-8f0f-6e6077747800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.286664 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.303035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.439141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerStarted","Data":"906f7a1c77acb0c1c05f8358dd94558118893ab2ab4fa0d59651e9cfe0745337"} Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.455082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79cf68bf7b-r88sb" event={"ID":"436900b7-4647-44d8-8f0f-6e6077747800","Type":"ContainerDied","Data":"5ac1432c9f6f89391100108cd3e553c15c6f329d150aa4b6283f64a7b8917600"} Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.455136 4749 scope.go:117] 
"RemoveContainer" containerID="378ad2fd5a8829a92d3577ec6ec91a92b3e091baf09191f73f6dd468ff370b3c" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.455407 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79cf68bf7b-r88sb" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.512511 4749 scope.go:117] "RemoveContainer" containerID="92274068016fa71ba48799703446fbcf030ea448fcc91ca833ddc199876bd97f" Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.522130 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:51 crc kubenswrapper[4749]: I1129 01:33:51.534219 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79cf68bf7b-r88sb"] Nov 29 01:33:52 crc kubenswrapper[4749]: I1129 01:33:52.471308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerStarted","Data":"f743ba6ff8b1da63b560049f4030c6a63de306ec90e8671821fcc72e7725c52a"} Nov 29 01:33:52 crc kubenswrapper[4749]: I1129 01:33:52.938714 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 29 01:33:53 crc kubenswrapper[4749]: I1129 01:33:53.085875 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="436900b7-4647-44d8-8f0f-6e6077747800" path="/var/lib/kubelet/pods/436900b7-4647-44d8-8f0f-6e6077747800/volumes" Nov 29 01:33:53 crc kubenswrapper[4749]: I1129 01:33:53.480678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerStarted","Data":"5ed421f49bc336804d73a5ee8923bd091bdb89ba7051efb3930f5c62c2dfef03"} Nov 29 01:33:53 crc kubenswrapper[4749]: I1129 01:33:53.504397 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.5043779539999997 podStartE2EDuration="3.504377954s" podCreationTimestamp="2025-11-29 01:33:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:53.503712528 +0000 UTC m=+1376.675862405" watchObservedRunningTime="2025-11-29 01:33:53.504377954 +0000 UTC m=+1376.676527811" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.762834 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 29 01:33:55 crc kubenswrapper[4749]: E1129 01:33:55.763545 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-api" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.763562 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-api" Nov 29 01:33:55 crc kubenswrapper[4749]: E1129 01:33:55.763608 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-httpd" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.763618 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-httpd" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.763841 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-api" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 
01:33:55.763870 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="436900b7-4647-44d8-8f0f-6e6077747800" containerName="neutron-httpd" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.764620 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.767693 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.767947 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.775734 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sxng5" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.788149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.799282 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.897006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.897079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg7rs\" (UniqueName: \"kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.897281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.897387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.984095 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"] Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.986220 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-84b776bb8c-llx6x" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.988075 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.988764 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.989020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.999736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.999834 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:55 crc kubenswrapper[4749]: I1129 01:33:55.999964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.000020 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg7rs\" (UniqueName: \"kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.001568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.003589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"] Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.008045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.009551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.021075 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.021371 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-central-agent" containerID="cri-o://d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce" gracePeriod=30 Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.021596 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="proxy-httpd" containerID="cri-o://1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896" gracePeriod=30 Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.021618 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-notification-agent" containerID="cri-o://8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff" gracePeriod=30 Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.021753 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="sg-core" containerID="cri-o://0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1" gracePeriod=30 Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.037741 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg7rs\" (UniqueName: \"kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs\") pod \"openstackclient\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") " pod="openstack/openstackclient" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.054690 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": EOF" Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.088178 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqs4c\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.105644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.133799 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.160414 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.178631 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.180531 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.197497 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqs4c\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207322 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.207908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.208520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.212726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.212764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.213882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.215896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.223160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.224071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqs4c\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c\") pod \"swift-proxy-84b776bb8c-llx6x\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: E1129 01:33:56.257786 4749 log.go:32] "RunPodSandbox from runtime service failed" err=<
Nov 29 01:33:56 crc kubenswrapper[4749]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ec326d6a-dbe9-4d09-9b16-1a1698ff297f_0(2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6" Netns:"/var/run/netns/3fad8af8-ecc2-4a22-9fcf-5f0f9aba974c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6;K8S_POD_UID=ec326d6a-dbe9-4d09-9b16-1a1698ff297f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ec326d6a-dbe9-4d09-9b16-1a1698ff297f]: expected pod UID "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" but got "5b102261-e51d-4b92-a267-167f1ffd0a41" from Kube API
Nov 29 01:33:56 crc kubenswrapper[4749]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Nov 29 01:33:56 crc kubenswrapper[4749]: >
Nov 29 01:33:56 crc kubenswrapper[4749]: E1129 01:33:56.257860 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Nov 29 01:33:56 crc kubenswrapper[4749]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ec326d6a-dbe9-4d09-9b16-1a1698ff297f_0(2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6" Netns:"/var/run/netns/3fad8af8-ecc2-4a22-9fcf-5f0f9aba974c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2050a89bf1fb5a35a54cc629d38a66f7919f1f48f1ed0f59044209a91656cfa6;K8S_POD_UID=ec326d6a-dbe9-4d09-9b16-1a1698ff297f" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ec326d6a-dbe9-4d09-9b16-1a1698ff297f]: expected pod UID "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" but got "5b102261-e51d-4b92-a267-167f1ffd0a41" from Kube API
Nov 29 01:33:56 crc kubenswrapper[4749]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Nov 29 01:33:56 crc kubenswrapper[4749]: > pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.308831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvtq\" (UniqueName: \"kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.308959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.309209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.309269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.405179 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.411447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvtq\" (UniqueName: \"kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.411526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.411591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.411616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.412409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.415398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.415485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.428501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvtq\" (UniqueName: \"kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq\") pod \"openstackclient\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.530717 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerID="1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896" exitCode=0
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.531130 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerID="0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1" exitCode=2
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.531143 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerID="d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce" exitCode=0
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.530803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerDied","Data":"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"}
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.531296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerDied","Data":"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"}
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.531319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerDied","Data":"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"}
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.531486 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.534601 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ec326d6a-dbe9-4d09-9b16-1a1698ff297f" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.545231 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.593001 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.615907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret\") pod \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") "
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.615960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle\") pod \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") "
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.616019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg7rs\" (UniqueName: \"kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs\") pod \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") "
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.616088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config\") pod \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\" (UID: \"ec326d6a-dbe9-4d09-9b16-1a1698ff297f\") "
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.616819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" (UID: "ec326d6a-dbe9-4d09-9b16-1a1698ff297f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.622478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs" (OuterVolumeSpecName: "kube-api-access-fg7rs") pod "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" (UID: "ec326d6a-dbe9-4d09-9b16-1a1698ff297f"). InnerVolumeSpecName "kube-api-access-fg7rs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.622548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" (UID: "ec326d6a-dbe9-4d09-9b16-1a1698ff297f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.622593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ec326d6a-dbe9-4d09-9b16-1a1698ff297f" (UID: "ec326d6a-dbe9-4d09-9b16-1a1698ff297f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.718514 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.718614 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.718627 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg7rs\" (UniqueName: \"kubernetes.io/projected/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-kube-api-access-fg7rs\") on node \"crc\" DevicePath \"\""
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.718639 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ec326d6a-dbe9-4d09-9b16-1a1698ff297f-openstack-config\") on node \"crc\" DevicePath \"\""
Nov 29 01:33:56 crc kubenswrapper[4749]: W1129 01:33:56.944503 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76910f08_d491_4b48_9439_78baad6ac3d3.slice/crio-bb70d12fcb9e5fb42dda77e67b8bad95433e6a3632c0ea01f446519d352d2834 WatchSource:0}: Error finding container bb70d12fcb9e5fb42dda77e67b8bad95433e6a3632c0ea01f446519d352d2834: Status 404 returned error can't find the container with id bb70d12fcb9e5fb42dda77e67b8bad95433e6a3632c0ea01f446519d352d2834
Nov 29 01:33:56 crc kubenswrapper[4749]: I1129 01:33:56.945647 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"]
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.053969 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 29 01:33:57 crc kubenswrapper[4749]: W1129 01:33:57.060398 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b102261_e51d_4b92_a267_167f1ffd0a41.slice/crio-dc55a2b8f18ed7b3e1863e6ab31e4a8b69128755d4efe7abcdd9778944a9153f WatchSource:0}: Error finding container dc55a2b8f18ed7b3e1863e6ab31e4a8b69128755d4efe7abcdd9778944a9153f: Status 404 returned error can't find the container with id dc55a2b8f18ed7b3e1863e6ab31e4a8b69128755d4efe7abcdd9778944a9153f
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.113509 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec326d6a-dbe9-4d09-9b16-1a1698ff297f" path="/var/lib/kubelet/pods/ec326d6a-dbe9-4d09-9b16-1a1698ff297f/volumes"
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.550536 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerStarted","Data":"b63a6f1d3dc68fbb3a1047eaf37bb7a13d0fbf9d362ca277af916b69381c4f06"}
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.550985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.551004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerStarted","Data":"3456069d4e6f0aebd8654c15b636090a841895914222fd2b0f04601cdc235e31"}
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.551019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerStarted","Data":"bb70d12fcb9e5fb42dda77e67b8bad95433e6a3632c0ea01f446519d352d2834"}
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.551051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.552276 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.552265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5b102261-e51d-4b92-a267-167f1ffd0a41","Type":"ContainerStarted","Data":"dc55a2b8f18ed7b3e1863e6ab31e4a8b69128755d4efe7abcdd9778944a9153f"}
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.576459 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ec326d6a-dbe9-4d09-9b16-1a1698ff297f" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41"
Nov 29 01:33:57 crc kubenswrapper[4749]: I1129 01:33:57.579788 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-84b776bb8c-llx6x" podStartSLOduration=2.579767705 podStartE2EDuration="2.579767705s" podCreationTimestamp="2025-11-29 01:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:33:57.569819136 +0000 UTC m=+1380.741969013" watchObservedRunningTime="2025-11-29 01:33:57.579767705 +0000 UTC m=+1380.751917562"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.315057 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493395 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493810 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.493964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg87q\" (UniqueName: \"kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q\") pod \"f1055f93-b59a-482b-86f7-d6109aa96abe\" (UID: \"f1055f93-b59a-482b-86f7-d6109aa96abe\") "
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.496615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.496708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.501397 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts" (OuterVolumeSpecName: "scripts") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.511417 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q" (OuterVolumeSpecName: "kube-api-access-fg87q") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "kube-api-access-fg87q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.524123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.583889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.587703 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerID="8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff" exitCode=0
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.587755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerDied","Data":"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"}
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.587796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f1055f93-b59a-482b-86f7-d6109aa96abe","Type":"ContainerDied","Data":"ca4083126ea2ef9365a012ae85da1e24a4bd62db04b2266ea14beab6316e936c"}
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.587828 4749 scope.go:117] "RemoveContainer" containerID="1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.588013 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596537 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596570 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596580 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg87q\" (UniqueName: \"kubernetes.io/projected/f1055f93-b59a-482b-86f7-d6109aa96abe-kube-api-access-fg87q\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596600 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596609 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.596618 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f1055f93-b59a-482b-86f7-d6109aa96abe-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.634574 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data" (OuterVolumeSpecName: "config-data") pod "f1055f93-b59a-482b-86f7-d6109aa96abe" (UID: "f1055f93-b59a-482b-86f7-d6109aa96abe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.639051 4749 scope.go:117] "RemoveContainer" containerID="0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.672416 4749 scope.go:117] "RemoveContainer" containerID="8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.695393 4749 scope.go:117] "RemoveContainer" containerID="d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.699373 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1055f93-b59a-482b-86f7-d6109aa96abe-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.716579 4749 scope.go:117] "RemoveContainer" containerID="1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.717010 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896\": container with ID starting with 1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896 not found: ID does not exist" containerID="1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717053 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896"} err="failed to get container status \"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896\": rpc error: code = NotFound desc = could not find container \"1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896\": container with ID starting with 1f80f612b19631e4a521a4bff4d3c8fd881009af6654388db2e79b61b60d0896 not found: ID does not exist"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717080 4749 scope.go:117] "RemoveContainer" containerID="0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.717515 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1\": container with ID starting with 0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1 not found: ID does not exist" containerID="0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717547 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1"} err="failed to get container status \"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1\": rpc error: code = NotFound desc = could not find container \"0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1\": container with ID starting with 0ca3c8433c9931ece16c3a95ce23f1bd9c215549b591981da4b4d2652cff27c1 not found: ID does not exist"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717571 4749 scope.go:117] "RemoveContainer" containerID="8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.717944 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff\": container with ID starting with 8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff not found: ID does not exist" containerID="8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717965 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff"} err="failed to get container status \"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff\": rpc error: code = NotFound desc = could not find container \"8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff\": container with ID starting with 8dcf45122a684f91cf2a391951ede0dfb57e5d08e2b8908c56a53b1211f62cff not found: ID does not exist"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.717977 4749 scope.go:117] "RemoveContainer" containerID="d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.718358 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce\": container with ID starting with d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce not found: ID does not exist" containerID="d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.718384 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce"} err="failed to get container status \"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce\": rpc error: code = NotFound desc = could not find container \"d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce\": container with ID starting with d61a28f0e90b4e04aa4d7a5fd3f2cdce9673b556453d61e6451cafa9014674ce not found: ID does not exist"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.954623 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.975184 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987085 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.987431 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="sg-core"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987448 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="sg-core"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.987462 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="proxy-httpd"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987469 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="proxy-httpd"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.987483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-central-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987489 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-central-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: E1129 01:34:00.987503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-notification-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987509 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-notification-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987678 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="proxy-httpd"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987695 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-notification-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987714 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="ceilometer-central-agent"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.987726 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" containerName="sg-core"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.989230 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.993948 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.994238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 29 01:34:00 crc kubenswrapper[4749]: I1129 01:34:00.999019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.053651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.100583 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1055f93-b59a-482b-86f7-d6109aa96abe" path="/var/lib/kubelet/pods/f1055f93-b59a-482b-86f7-d6109aa96abe/volumes"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107356 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmrjd\" (UniqueName: \"kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.107664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: E1129 01:34:01.166679 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1055f93_b59a_482b_86f7_d6109aa96abe.slice\": RecentStats: unable to find data in memory cache]"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.208882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.208946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmrjd\" (UniqueName: \"kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.209045 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.209086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.209104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.209131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.209165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.210917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.211010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.217618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.222083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.226441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.228826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.229639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmrjd\" (UniqueName: \"kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd\") pod \"ceilometer-0\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " pod="openstack/ceilometer-0"
Nov 29 01:34:01 crc kubenswrapper[4749]: I1129 01:34:01.344568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:04 crc kubenswrapper[4749]: I1129 01:34:04.401283 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:06 crc kubenswrapper[4749]: I1129 01:34:06.469925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:34:06 crc kubenswrapper[4749]: I1129 01:34:06.476626 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-84b776bb8c-llx6x"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.660283 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:07 crc kubenswrapper[4749]: W1129 01:34:07.667153 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3fb6c0c_0956_4000_94ad_6c62f8da8b5a.slice/crio-4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b WatchSource:0}: Error finding container 4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b: Status 404 returned error can't find the container with id 4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.669560 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.674676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5b102261-e51d-4b92-a267-167f1ffd0a41","Type":"ContainerStarted","Data":"b48098c5e4956d09db8ec03faedfc1ef2c22fcc5a22369e6385279648a81cfce"}
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.710873 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.5545339249999999 podStartE2EDuration="11.710846082s" podCreationTimestamp="2025-11-29 01:33:56 +0000 UTC" firstStartedPulling="2025-11-29 01:33:57.062594363 +0000 UTC m=+1380.234744220" lastFinishedPulling="2025-11-29 01:34:07.21890652 +0000 UTC m=+1390.391056377" observedRunningTime="2025-11-29 01:34:07.701080018 +0000 UTC m=+1390.873229885" watchObservedRunningTime="2025-11-29 01:34:07.710846082 +0000 UTC m=+1390.882995939"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.837991 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-rn8bz"]
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.839682 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.851541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rn8bz"]
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.933375 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-c9grk"]
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.935138 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.952966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3ebb-account-create-update-nltbl"]
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.954109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.966416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zrl4\" (UniqueName: \"kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.966506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.966908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.973860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c9grk"]
Nov 29 01:34:07 crc kubenswrapper[4749]: I1129 01:34:07.980889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ebb-account-create-update-nltbl"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.068905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5h5m\" (UniqueName: \"kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.068965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6fz\" (UniqueName: \"kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.068995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.069044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.069116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zrl4\" (UniqueName: \"kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.069185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.071390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.087696 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-cbs5l"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.089052 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cbs5l"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.091843 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zrl4\" (UniqueName: \"kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4\") pod \"nova-api-db-create-rn8bz\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.106124 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cbs5l"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.145775 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8947-account-create-update-n8bjz"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.147180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8947-account-create-update-n8bjz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.151424 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.155017 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8947-account-create-update-n8bjz"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.161020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rn8bz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.170246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5h5m\" (UniqueName: \"kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.170281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6fz\" (UniqueName: \"kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.170307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.170352 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.171920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.172189 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.189705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5h5m\" (UniqueName: \"kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m\") pod \"nova-api-3ebb-account-create-update-nltbl\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.196908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6fz\" (UniqueName: \"kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz\") pod \"nova-cell0-db-create-c9grk\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.263012 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c9grk"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.272342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts\") pod \"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.272454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l4sj\" (UniqueName: \"kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj\") pod \"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.272563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.272604 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wng\" (UniqueName: \"kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.282042 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ebb-account-create-update-nltbl"
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.366317 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-089c-account-create-update-cxbw6"]
Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.368124 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.374817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.374903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wng\" (UniqueName: \"kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.374967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts\") pod \"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.375031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l4sj\" (UniqueName: \"kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj\") pod \"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.375893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.375901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts\") pod \"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.376416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.377501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-089c-account-create-update-cxbw6"] Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.399155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wng\" (UniqueName: \"kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng\") pod \"nova-cell1-db-create-cbs5l\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.401228 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l4sj\" (UniqueName: \"kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj\") pod 
\"nova-cell0-8947-account-create-update-n8bjz\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.422014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.478594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsdw\" (UniqueName: \"kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.478656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.501034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-rn8bz"] Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.580397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsdw\" (UniqueName: \"kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.580854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.581733 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.604085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsdw\" (UniqueName: \"kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw\") pod \"nova-cell1-089c-account-create-update-cxbw6\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.660386 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.704556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerStarted","Data":"cd4b2d7fe24bdd93d10fb596919ae826b0504292c7e202f61e2a5d2d20edfc19"} Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.704604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerStarted","Data":"4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b"} Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.720405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rn8bz" event={"ID":"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d","Type":"ContainerStarted","Data":"f162421cf816c95e95d3252e46ebc48bb905c7b020c977d87771d9af6c678f2f"} Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.744604 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:08 crc kubenswrapper[4749]: I1129 01:34:08.973667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ebb-account-create-update-nltbl"] Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.009262 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c9grk"] Nov 29 01:34:09 crc kubenswrapper[4749]: W1129 01:34:09.016926 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39408fef_9da8_4fde_b501_421748576739.slice/crio-380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec WatchSource:0}: Error finding container 380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec: Status 404 returned error can't find the container with id 380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.190134 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-cbs5l"] Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.319355 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8947-account-create-update-n8bjz"] Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.338674 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-089c-account-create-update-cxbw6"] Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.735826 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c9grk" event={"ID":"39408fef-9da8-4fde-b501-421748576739","Type":"ContainerStarted","Data":"02683ddd55e0d8639114410723c72fa11f946bafbeb6b895d2b31282c1143425"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.736048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c9grk" event={"ID":"39408fef-9da8-4fde-b501-421748576739","Type":"ContainerStarted","Data":"380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.739718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" event={"ID":"60229df5-f068-4b05-ba77-f02234f912f7","Type":"ContainerStarted","Data":"1d60feb658afa56447686bacf5087fdb32435f3e030225e481d08f3b86fab8ec"} Nov 
29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.739743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" event={"ID":"60229df5-f068-4b05-ba77-f02234f912f7","Type":"ContainerStarted","Data":"fbd63e98b0d164401005e57ac76308382e89979d04f7fa84e0d2fe7293eeff46"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.750906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerStarted","Data":"a7824ea4a1031c1f65c49977bfaf87c386308a4ee677e25e7488e91b8b1a8925"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.755107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cbs5l" event={"ID":"400b5586-62b7-420f-86ea-67d92778f6f2","Type":"ContainerStarted","Data":"8d561778eb4b907693839e5fa61e32e52ffe73716ac1b9c6c7bd406db4403383"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.755138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cbs5l" event={"ID":"400b5586-62b7-420f-86ea-67d92778f6f2","Type":"ContainerStarted","Data":"ea04016a81097f461129ef17e0d918b16093e683266bfe2d5592402a0c1517c5"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.763122 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-c9grk" podStartSLOduration=2.763111915 podStartE2EDuration="2.763111915s" podCreationTimestamp="2025-11-29 01:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:09.751935516 +0000 UTC m=+1392.924085373" watchObservedRunningTime="2025-11-29 01:34:09.763111915 +0000 UTC m=+1392.935261772" Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.766771 4749 generic.go:334] "Generic (PLEG): container finished" podID="edfd4dba-036b-469a-bd29-185017dbfa55" containerID="1b79b846ba48ed32c135a2f846a3f7529ae64e566dfcbb87a5a76c965c461763" exitCode=0 Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.766835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ebb-account-create-update-nltbl" event={"ID":"edfd4dba-036b-469a-bd29-185017dbfa55","Type":"ContainerDied","Data":"1b79b846ba48ed32c135a2f846a3f7529ae64e566dfcbb87a5a76c965c461763"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.766862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ebb-account-create-update-nltbl" event={"ID":"edfd4dba-036b-469a-bd29-185017dbfa55","Type":"ContainerStarted","Data":"2e3696486712c512c7969293e3528c0d6ab376ca03cad8a3b3965e7e11f99d2c"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.768158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" event={"ID":"1e2df569-0f1e-43d6-a4aa-983e7af9b753","Type":"ContainerStarted","Data":"a5eb98b5651d00f847dc7baf4de09dc7755d80312cb6155f7d2d577d49c2b8e1"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.768185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" event={"ID":"1e2df569-0f1e-43d6-a4aa-983e7af9b753","Type":"ContainerStarted","Data":"c8c41229fb970996dd24a93cd1495fdccd196ad82c877f807368cd32c3f25bca"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.771622 4749 generic.go:334] "Generic (PLEG): container finished" podID="0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" 
containerID="ee0799db0869202f475d230e6942e96a55275b4ddc42e2b4b308c9e58f27ff1f" exitCode=0 Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.771669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rn8bz" event={"ID":"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d","Type":"ContainerDied","Data":"ee0799db0869202f475d230e6942e96a55275b4ddc42e2b4b308c9e58f27ff1f"} Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.802464 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-cbs5l" podStartSLOduration=1.802443817 podStartE2EDuration="1.802443817s" podCreationTimestamp="2025-11-29 01:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:09.785317999 +0000 UTC m=+1392.957467866" watchObservedRunningTime="2025-11-29 01:34:09.802443817 +0000 UTC m=+1392.974593674" Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.808470 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" podStartSLOduration=1.808459107 podStartE2EDuration="1.808459107s" podCreationTimestamp="2025-11-29 01:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:09.774802837 +0000 UTC m=+1392.946952694" watchObservedRunningTime="2025-11-29 01:34:09.808459107 +0000 UTC m=+1392.980608964" Nov 29 01:34:09 crc kubenswrapper[4749]: I1129 01:34:09.837094 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" podStartSLOduration=1.8370786620000001 podStartE2EDuration="1.837078662s" podCreationTimestamp="2025-11-29 01:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:09.834964379 +0000 UTC m=+1393.007114236" watchObservedRunningTime="2025-11-29 01:34:09.837078662 +0000 UTC m=+1393.009228519" Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.786213 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e2df569-0f1e-43d6-a4aa-983e7af9b753" containerID="a5eb98b5651d00f847dc7baf4de09dc7755d80312cb6155f7d2d577d49c2b8e1" exitCode=0 Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.786284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" event={"ID":"1e2df569-0f1e-43d6-a4aa-983e7af9b753","Type":"ContainerDied","Data":"a5eb98b5651d00f847dc7baf4de09dc7755d80312cb6155f7d2d577d49c2b8e1"} Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.788486 4749 generic.go:334] "Generic (PLEG): container finished" podID="39408fef-9da8-4fde-b501-421748576739" containerID="02683ddd55e0d8639114410723c72fa11f946bafbeb6b895d2b31282c1143425" exitCode=0 Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.788622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c9grk" event={"ID":"39408fef-9da8-4fde-b501-421748576739","Type":"ContainerDied","Data":"02683ddd55e0d8639114410723c72fa11f946bafbeb6b895d2b31282c1143425"} Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.790891 4749 generic.go:334] "Generic (PLEG): container finished" podID="60229df5-f068-4b05-ba77-f02234f912f7" containerID="1d60feb658afa56447686bacf5087fdb32435f3e030225e481d08f3b86fab8ec" exitCode=0 Nov 29 
01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.790928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" event={"ID":"60229df5-f068-4b05-ba77-f02234f912f7","Type":"ContainerDied","Data":"1d60feb658afa56447686bacf5087fdb32435f3e030225e481d08f3b86fab8ec"} Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.794234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerStarted","Data":"2b52f160e991611b4fa569384437e823fb44de2a911736414952d94b4cc7eca3"} Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.796136 4749 generic.go:334] "Generic (PLEG): container finished" podID="400b5586-62b7-420f-86ea-67d92778f6f2" containerID="8d561778eb4b907693839e5fa61e32e52ffe73716ac1b9c6c7bd406db4403383" exitCode=0 Nov 29 01:34:10 crc kubenswrapper[4749]: I1129 01:34:10.796412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cbs5l" event={"ID":"400b5586-62b7-420f-86ea-67d92778f6f2","Type":"ContainerDied","Data":"8d561778eb4b907693839e5fa61e32e52ffe73716ac1b9c6c7bd406db4403383"} Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.256721 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ebb-account-create-update-nltbl" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.267321 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-rn8bz" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.447445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts\") pod \"edfd4dba-036b-469a-bd29-185017dbfa55\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.447488 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zrl4\" (UniqueName: \"kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4\") pod \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.447543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5h5m\" (UniqueName: \"kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m\") pod \"edfd4dba-036b-469a-bd29-185017dbfa55\" (UID: \"edfd4dba-036b-469a-bd29-185017dbfa55\") " Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.447633 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts\") pod \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\" (UID: \"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d\") " Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.448326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edfd4dba-036b-469a-bd29-185017dbfa55" (UID: "edfd4dba-036b-469a-bd29-185017dbfa55"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.448783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" (UID: "0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.469381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m" (OuterVolumeSpecName: "kube-api-access-w5h5m") pod "edfd4dba-036b-469a-bd29-185017dbfa55" (UID: "edfd4dba-036b-469a-bd29-185017dbfa55"). InnerVolumeSpecName "kube-api-access-w5h5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.469932 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4" (OuterVolumeSpecName: "kube-api-access-5zrl4") pod "0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" (UID: "0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d"). InnerVolumeSpecName "kube-api-access-5zrl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.549689 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edfd4dba-036b-469a-bd29-185017dbfa55-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.549959 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zrl4\" (UniqueName: \"kubernetes.io/projected/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-kube-api-access-5zrl4\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.549968 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5h5m\" (UniqueName: \"kubernetes.io/projected/edfd4dba-036b-469a-bd29-185017dbfa55-kube-api-access-w5h5m\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.549977 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.806439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ebb-account-create-update-nltbl" event={"ID":"edfd4dba-036b-469a-bd29-185017dbfa55","Type":"ContainerDied","Data":"2e3696486712c512c7969293e3528c0d6ab376ca03cad8a3b3965e7e11f99d2c"} Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.806479 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e3696486712c512c7969293e3528c0d6ab376ca03cad8a3b3965e7e11f99d2c" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.806449 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ebb-account-create-update-nltbl" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.808596 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-rn8bz" Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.808607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-rn8bz" event={"ID":"0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d","Type":"ContainerDied","Data":"f162421cf816c95e95d3252e46ebc48bb905c7b020c977d87771d9af6c678f2f"} Nov 29 01:34:11 crc kubenswrapper[4749]: I1129 01:34:11.808652 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f162421cf816c95e95d3252e46ebc48bb905c7b020c977d87771d9af6c678f2f" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.213594 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.362133 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.367764 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l4sj\" (UniqueName: \"kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj\") pod \"60229df5-f068-4b05-ba77-f02234f912f7\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.368589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts\") pod \"60229df5-f068-4b05-ba77-f02234f912f7\" (UID: \"60229df5-f068-4b05-ba77-f02234f912f7\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.370004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60229df5-f068-4b05-ba77-f02234f912f7" (UID: "60229df5-f068-4b05-ba77-f02234f912f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.372862 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60229df5-f068-4b05-ba77-f02234f912f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.385933 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj" (OuterVolumeSpecName: "kube-api-access-2l4sj") pod "60229df5-f068-4b05-ba77-f02234f912f7" (UID: "60229df5-f068-4b05-ba77-f02234f912f7"). InnerVolumeSpecName "kube-api-access-2l4sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.400491 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.478921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7wng\" (UniqueName: \"kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng\") pod \"400b5586-62b7-420f-86ea-67d92778f6f2\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.479008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts\") pod \"400b5586-62b7-420f-86ea-67d92778f6f2\" (UID: \"400b5586-62b7-420f-86ea-67d92778f6f2\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.488450 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "400b5586-62b7-420f-86ea-67d92778f6f2" (UID: "400b5586-62b7-420f-86ea-67d92778f6f2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.491278 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l4sj\" (UniqueName: \"kubernetes.io/projected/60229df5-f068-4b05-ba77-f02234f912f7-kube-api-access-2l4sj\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.491330 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400b5586-62b7-420f-86ea-67d92778f6f2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.507065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng" (OuterVolumeSpecName: "kube-api-access-g7wng") pod "400b5586-62b7-420f-86ea-67d92778f6f2" (UID: "400b5586-62b7-420f-86ea-67d92778f6f2"). InnerVolumeSpecName "kube-api-access-g7wng". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.572688 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-c9grk" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.594569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts\") pod \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.594733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrsdw\" (UniqueName: \"kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw\") pod \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\" (UID: \"1e2df569-0f1e-43d6-a4aa-983e7af9b753\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.595289 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7wng\" (UniqueName: \"kubernetes.io/projected/400b5586-62b7-420f-86ea-67d92778f6f2-kube-api-access-g7wng\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.595988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e2df569-0f1e-43d6-a4aa-983e7af9b753" (UID: "1e2df569-0f1e-43d6-a4aa-983e7af9b753"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.600421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw" (OuterVolumeSpecName: "kube-api-access-vrsdw") pod "1e2df569-0f1e-43d6-a4aa-983e7af9b753" (UID: "1e2df569-0f1e-43d6-a4aa-983e7af9b753"). InnerVolumeSpecName "kube-api-access-vrsdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.696809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6fz\" (UniqueName: \"kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz\") pod \"39408fef-9da8-4fde-b501-421748576739\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.697052 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts\") pod \"39408fef-9da8-4fde-b501-421748576739\" (UID: \"39408fef-9da8-4fde-b501-421748576739\") " Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.697539 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39408fef-9da8-4fde-b501-421748576739" (UID: "39408fef-9da8-4fde-b501-421748576739"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.697576 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2df569-0f1e-43d6-a4aa-983e7af9b753-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.697594 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrsdw\" (UniqueName: \"kubernetes.io/projected/1e2df569-0f1e-43d6-a4aa-983e7af9b753-kube-api-access-vrsdw\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.699866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz" (OuterVolumeSpecName: "kube-api-access-kx6fz") pod "39408fef-9da8-4fde-b501-421748576739" (UID: "39408fef-9da8-4fde-b501-421748576739"). InnerVolumeSpecName "kube-api-access-kx6fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.799664 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39408fef-9da8-4fde-b501-421748576739-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.799939 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6fz\" (UniqueName: \"kubernetes.io/projected/39408fef-9da8-4fde-b501-421748576739-kube-api-access-kx6fz\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.817449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" event={"ID":"60229df5-f068-4b05-ba77-f02234f912f7","Type":"ContainerDied","Data":"fbd63e98b0d164401005e57ac76308382e89979d04f7fa84e0d2fe7293eeff46"} Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.817489 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd63e98b0d164401005e57ac76308382e89979d04f7fa84e0d2fe7293eeff46" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.817558 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8947-account-create-update-n8bjz" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.819788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerStarted","Data":"7334ced92ad7e7487e07d11477e12388455a9967e089ebd9586c29cd48fd9002"} Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.819850 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-central-agent" containerID="cri-o://cd4b2d7fe24bdd93d10fb596919ae826b0504292c7e202f61e2a5d2d20edfc19" gracePeriod=30 Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.819884 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="proxy-httpd" containerID="cri-o://7334ced92ad7e7487e07d11477e12388455a9967e089ebd9586c29cd48fd9002" gracePeriod=30 Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.819893 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="sg-core" containerID="cri-o://2b52f160e991611b4fa569384437e823fb44de2a911736414952d94b4cc7eca3" gracePeriod=30 Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.819897 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-notification-agent" containerID="cri-o://a7824ea4a1031c1f65c49977bfaf87c386308a4ee677e25e7488e91b8b1a8925" gracePeriod=30 Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.820104 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.823355 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-cbs5l" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.823547 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-cbs5l" event={"ID":"400b5586-62b7-420f-86ea-67d92778f6f2","Type":"ContainerDied","Data":"ea04016a81097f461129ef17e0d918b16093e683266bfe2d5592402a0c1517c5"} Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.823598 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea04016a81097f461129ef17e0d918b16093e683266bfe2d5592402a0c1517c5" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.830270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" event={"ID":"1e2df569-0f1e-43d6-a4aa-983e7af9b753","Type":"ContainerDied","Data":"c8c41229fb970996dd24a93cd1495fdccd196ad82c877f807368cd32c3f25bca"} Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.830299 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-089c-account-create-update-cxbw6" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.830306 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c41229fb970996dd24a93cd1495fdccd196ad82c877f807368cd32c3f25bca" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.834067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c9grk" event={"ID":"39408fef-9da8-4fde-b501-421748576739","Type":"ContainerDied","Data":"380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec"} Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.834104 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="380b0613dc81420f744b31dc0b86e6f72ce7e88abcd5b98f414744246f0471ec" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.834120 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c9grk" Nov 29 01:34:12 crc kubenswrapper[4749]: I1129 01:34:12.847734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.432929207 podStartE2EDuration="12.847717461s" podCreationTimestamp="2025-11-29 01:34:00 +0000 UTC" firstStartedPulling="2025-11-29 01:34:07.669246023 +0000 UTC m=+1390.841395880" lastFinishedPulling="2025-11-29 01:34:12.084034277 +0000 UTC m=+1395.256184134" observedRunningTime="2025-11-29 01:34:12.838126761 +0000 UTC m=+1396.010276618" watchObservedRunningTime="2025-11-29 01:34:12.847717461 +0000 UTC m=+1396.019867328" Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851817 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerID="7334ced92ad7e7487e07d11477e12388455a9967e089ebd9586c29cd48fd9002" exitCode=0 Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851856 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerID="2b52f160e991611b4fa569384437e823fb44de2a911736414952d94b4cc7eca3" exitCode=2 Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851867 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerID="a7824ea4a1031c1f65c49977bfaf87c386308a4ee677e25e7488e91b8b1a8925" exitCode=0 Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerDied","Data":"7334ced92ad7e7487e07d11477e12388455a9967e089ebd9586c29cd48fd9002"} Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerDied","Data":"2b52f160e991611b4fa569384437e823fb44de2a911736414952d94b4cc7eca3"} Nov 29 01:34:13 crc kubenswrapper[4749]: I1129 01:34:13.851949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerDied","Data":"a7824ea4a1031c1f65c49977bfaf87c386308a4ee677e25e7488e91b8b1a8925"} Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.201447 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8f2xf"] Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202249 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1e2df569-0f1e-43d6-a4aa-983e7af9b753" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202263 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2df569-0f1e-43d6-a4aa-983e7af9b753" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202291 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202298 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202311 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60229df5-f068-4b05-ba77-f02234f912f7" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202317 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60229df5-f068-4b05-ba77-f02234f912f7" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202333 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39408fef-9da8-4fde-b501-421748576739" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202339 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39408fef-9da8-4fde-b501-421748576739" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202353 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400b5586-62b7-420f-86ea-67d92778f6f2" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202359 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="400b5586-62b7-420f-86ea-67d92778f6f2" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: E1129 01:34:18.202371 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edfd4dba-036b-469a-bd29-185017dbfa55" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202377 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="edfd4dba-036b-469a-bd29-185017dbfa55" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202548 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="60229df5-f068-4b05-ba77-f02234f912f7" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202569 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39408fef-9da8-4fde-b501-421748576739" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202583 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="400b5586-62b7-420f-86ea-67d92778f6f2" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202610 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="edfd4dba-036b-469a-bd29-185017dbfa55" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.202620 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" containerName="mariadb-database-create" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 
01:34:18.202632 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2df569-0f1e-43d6-a4aa-983e7af9b753" containerName="mariadb-account-create-update" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.203342 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.206563 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.213591 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8f2xf"] Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.214358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.214675 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2xnnz" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.300650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sw7\" (UniqueName: \"kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.301054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.301081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.301103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.403305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sw7\" (UniqueName: \"kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.403489 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc 
kubenswrapper[4749]: I1129 01:34:18.403527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.403555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.418338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.418461 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.419273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.429188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8sw7\" (UniqueName: \"kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7\") pod \"nova-cell0-conductor-db-sync-8f2xf\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:18 crc kubenswrapper[4749]: I1129 01:34:18.522424 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.064291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8f2xf"] Nov 29 01:34:19 crc kubenswrapper[4749]: W1129 01:34:19.076727 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d35e2c_828c_4e65_bd18_117a1b053783.slice/crio-f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329 WatchSource:0}: Error finding container f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329: Status 404 returned error can't find the container with id f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329 Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.919410 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerID="cd4b2d7fe24bdd93d10fb596919ae826b0504292c7e202f61e2a5d2d20edfc19" exitCode=0 Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.919573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerDied","Data":"cd4b2d7fe24bdd93d10fb596919ae826b0504292c7e202f61e2a5d2d20edfc19"} Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.919831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a","Type":"ContainerDied","Data":"4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b"} Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.919858 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ec5ea0b7ee66425be59dda430d1dbe6b0fbbd4fcf4548aec6399f006250d63b" Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.922289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" event={"ID":"e6d35e2c-828c-4e65-bd18-117a1b053783","Type":"ContainerStarted","Data":"f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329"} Nov 29 01:34:19 crc kubenswrapper[4749]: I1129 01:34:19.956872 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.138970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmrjd\" (UniqueName: \"kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139259 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.139482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd\") pod \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\" (UID: \"d3fb6c0c-0956-4000-94ad-6c62f8da8b5a\") " Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.140072 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.140064 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.156262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts" (OuterVolumeSpecName: "scripts") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.156885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd" (OuterVolumeSpecName: "kube-api-access-hmrjd") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "kube-api-access-hmrjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.242747 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.242793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmrjd\" (UniqueName: \"kubernetes.io/projected/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-kube-api-access-hmrjd\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.242813 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.242832 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.249482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.279684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.321970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data" (OuterVolumeSpecName: "config-data") pod "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" (UID: "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.344291 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.344326 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.344341 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.935151 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:20 crc kubenswrapper[4749]: I1129 01:34:20.981790 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.003778 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.021640 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:21 crc kubenswrapper[4749]: E1129 01:34:21.022096 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-notification-agent" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022117 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-notification-agent" Nov 29 01:34:21 crc kubenswrapper[4749]: E1129 01:34:21.022137 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-central-agent" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022146 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-central-agent" Nov 29 01:34:21 crc kubenswrapper[4749]: E1129 01:34:21.022163 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="sg-core" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022169 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="sg-core" Nov 29 01:34:21 crc kubenswrapper[4749]: E1129 01:34:21.022192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="proxy-httpd" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022216 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="proxy-httpd" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022443 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="sg-core" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022464 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-notification-agent" Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022474 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="ceilometer-central-agent"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.022491 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" containerName="proxy-httpd"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.024540 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.032489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.033154 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.035369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.094978 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fb6c0c-0956-4000-94ad-6c62f8da8b5a" path="/var/lib/kubelet/pods/d3fb6c0c-0956-4000-94ad-6c62f8da8b5a/volumes"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44wgf\" (UniqueName: \"kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159657 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.159679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.160007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44wgf\" (UniqueName: \"kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.262765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.263400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.267700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.277119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.277258 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.277775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.281939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44wgf\" (UniqueName: \"kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf\") pod \"ceilometer-0\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " pod="openstack/ceilometer-0"
Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.349375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
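
Once every TearDown for the old pod has succeeded, kubelet housekeeping removes the now-empty per-pod directory; that is the "Cleaned up orphaned pod volumes dir" line above, after which the replacement ceilometer-0 (UID 8851f5e9-...) repeats the attach/mount cycle under a fresh directory. A small sketch, assumed to run on the node itself with read access to /var/lib/kubelet, that checks whether the old pod's volumes dir is really gone (path layout taken from the log):

    // orphancheck.go - verify that kubelet removed the volumes dir for a
    // deleted pod UID; the UID below is the old ceilometer-0 from the log.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        uid := "d3fb6c0c-0956-4000-94ad-6c62f8da8b5a"
        dir := filepath.Join("/var/lib/kubelet/pods", uid, "volumes")
        if _, err := os.Stat(dir); os.IsNotExist(err) {
            fmt.Println("cleaned up:", dir)
        } else {
            fmt.Println("still present (or stat failed):", dir)
        }
    }
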
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:21 crc kubenswrapper[4749]: W1129 01:34:21.852518 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8851f5e9_6cbe_4078_a532_2735e8c5ce73.slice/crio-84ca22e683d9135b8e06199dd2b63ccc8b05b749388bf7525d7764fbbfa1ee7e WatchSource:0}: Error finding container 84ca22e683d9135b8e06199dd2b63ccc8b05b749388bf7525d7764fbbfa1ee7e: Status 404 returned error can't find the container with id 84ca22e683d9135b8e06199dd2b63ccc8b05b749388bf7525d7764fbbfa1ee7e Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.862015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:21 crc kubenswrapper[4749]: I1129 01:34:21.947815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerStarted","Data":"84ca22e683d9135b8e06199dd2b63ccc8b05b749388bf7525d7764fbbfa1ee7e"} Nov 29 01:34:24 crc kubenswrapper[4749]: I1129 01:34:24.610487 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:28 crc kubenswrapper[4749]: I1129 01:34:28.021290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" event={"ID":"e6d35e2c-828c-4e65-bd18-117a1b053783","Type":"ContainerStarted","Data":"3921629adb03e7df47ffbd7236135a367c35c210410b3590dffa731dd7125961"} Nov 29 01:34:28 crc kubenswrapper[4749]: I1129 01:34:28.023584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerStarted","Data":"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0"} Nov 29 01:34:28 crc kubenswrapper[4749]: I1129 01:34:28.023621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerStarted","Data":"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42"} Nov 29 01:34:28 crc kubenswrapper[4749]: I1129 01:34:28.044400 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" podStartSLOduration=2.2523657950000002 podStartE2EDuration="10.044375617s" podCreationTimestamp="2025-11-29 01:34:18 +0000 UTC" firstStartedPulling="2025-11-29 01:34:19.081351061 +0000 UTC m=+1402.253500918" lastFinishedPulling="2025-11-29 01:34:26.873360863 +0000 UTC m=+1410.045510740" observedRunningTime="2025-11-29 01:34:28.039127386 +0000 UTC m=+1411.211277243" watchObservedRunningTime="2025-11-29 01:34:28.044375617 +0000 UTC m=+1411.216525504" Nov 29 01:34:30 crc kubenswrapper[4749]: I1129 01:34:30.052145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerStarted","Data":"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d"} Nov 29 01:34:30 crc kubenswrapper[4749]: I1129 01:34:30.252306 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:30 crc kubenswrapper[4749]: I1129 01:34:30.252593 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-log" 
containerID="cri-o://63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce" gracePeriod=30 Nov 29 01:34:30 crc kubenswrapper[4749]: I1129 01:34:30.252653 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-httpd" containerID="cri-o://0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834" gracePeriod=30 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.064286 4749 generic.go:334] "Generic (PLEG): container finished" podID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerID="63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce" exitCode=143 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.064336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerDied","Data":"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce"} Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.067931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerStarted","Data":"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7"} Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.068080 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-central-agent" containerID="cri-o://67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42" gracePeriod=30 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.068105 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.068139 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="sg-core" containerID="cri-o://aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d" gracePeriod=30 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.068187 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-notification-agent" containerID="cri-o://be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0" gracePeriod=30 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.068348 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="proxy-httpd" containerID="cri-o://06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7" gracePeriod=30 Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.105484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.369006866 podStartE2EDuration="11.105468256s" podCreationTimestamp="2025-11-29 01:34:20 +0000 UTC" firstStartedPulling="2025-11-29 01:34:21.857503586 +0000 UTC m=+1405.029653443" lastFinishedPulling="2025-11-29 01:34:30.593964976 +0000 UTC m=+1413.766114833" observedRunningTime="2025-11-29 01:34:31.105101416 +0000 UTC m=+1414.277251303" watchObservedRunningTime="2025-11-29 01:34:31.105468256 +0000 UTC m=+1414.277618113" Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 
Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.774585 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.774916 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-log" containerID="cri-o://a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2" gracePeriod=30
Nov 29 01:34:31 crc kubenswrapper[4749]: I1129 01:34:31.774990 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-httpd" containerID="cri-o://49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4" gracePeriod=30
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.079859 4749 generic.go:334] "Generic (PLEG): container finished" podID="b390a04f-fb35-4166-af23-0b735e2f5266" containerID="a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2" exitCode=143
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.079942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerDied","Data":"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2"}
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082768 4749 generic.go:334] "Generic (PLEG): container finished" podID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerID="06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7" exitCode=0
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082794 4749 generic.go:334] "Generic (PLEG): container finished" podID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerID="aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d" exitCode=2
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082803 4749 generic.go:334] "Generic (PLEG): container finished" podID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerID="be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0" exitCode=0
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerDied","Data":"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7"}
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerDied","Data":"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d"}
Nov 29 01:34:32 crc kubenswrapper[4749]: I1129 01:34:32.082853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerDied","Data":"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0"}
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.044592 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.049843 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.114833 4749 generic.go:334] "Generic (PLEG): container finished" podID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerID="0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834" exitCode=0
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.114913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerDied","Data":"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834"}
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.114946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a","Type":"ContainerDied","Data":"074d319b9bc89fa3dc87e67d039854264afb00b251bc2b5d1fbe7402a12bfbb8"}
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.114968 4749 scope.go:117] "RemoveContainer" containerID="0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834"
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.115166 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.126948 4749 generic.go:334] "Generic (PLEG): container finished" podID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerID="67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42" exitCode=0
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.126995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerDied","Data":"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42"}
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.127054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8851f5e9-6cbe-4078-a532-2735e8c5ce73","Type":"ContainerDied","Data":"84ca22e683d9135b8e06199dd2b63ccc8b05b749388bf7525d7764fbbfa1ee7e"}
Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.127125 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.165913 4749 scope.go:117] "RemoveContainer" containerID="63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.169048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.169092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.169131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.169160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8f6\" (UniqueName: \"kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44wgf\" (UniqueName: \"kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data\") pod \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\" (UID: \"73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml\") pod \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\" (UID: \"8851f5e9-6cbe-4078-a532-2735e8c5ce73\") " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170629 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.170916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.171119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs" (OuterVolumeSpecName: "logs") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.171357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.177998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6" (OuterVolumeSpecName: "kube-api-access-mw8f6") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "kube-api-access-mw8f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.178061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts" (OuterVolumeSpecName: "scripts") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.181990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf" (OuterVolumeSpecName: "kube-api-access-44wgf") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "kube-api-access-44wgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.187576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.191391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts" (OuterVolumeSpecName: "scripts") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.216990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.255031 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.259190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data" (OuterVolumeSpecName: "config-data") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.259546 4749 scope.go:117] "RemoveContainer" containerID="0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.259987 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834\": container with ID starting with 0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834 not found: ID does not exist" containerID="0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.260032 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834"} err="failed to get container status \"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834\": rpc error: code = NotFound desc = could not find container \"0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834\": container with ID starting with 0efc5a47e1bdd06ceead33c2bd6d3611b6f7a2140013ebc131db47d89c468834 not found: ID does not exist" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.260060 4749 scope.go:117] "RemoveContainer" containerID="63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.260510 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce\": container with ID starting with 63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce not found: ID does not exist" containerID="63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.260538 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce"} err="failed to get container status \"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce\": rpc error: code = NotFound desc = could not find container \"63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce\": container with ID starting with 63b08b3aee0547693bee67e921305808409e2b6f2cfdd5e7ce2353cba25e42ce not found: ID does not exist" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.260562 4749 scope.go:117] "RemoveContainer" containerID="06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276733 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276766 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276775 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 
01:34:34.276783 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276805 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276814 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8f6\" (UniqueName: \"kubernetes.io/projected/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-kube-api-access-mw8f6\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276824 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276852 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276861 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276869 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44wgf\" (UniqueName: \"kubernetes.io/projected/8851f5e9-6cbe-4078-a532-2735e8c5ce73-kube-api-access-44wgf\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276877 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8851f5e9-6cbe-4078-a532-2735e8c5ce73-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.276885 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.281100 4749 scope.go:117] "RemoveContainer" containerID="aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.286845 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.297042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" (UID: "73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.311820 4749 scope.go:117] "RemoveContainer" containerID="be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.312299 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.339885 4749 scope.go:117] "RemoveContainer" containerID="67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.346881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data" (OuterVolumeSpecName: "config-data") pod "8851f5e9-6cbe-4078-a532-2735e8c5ce73" (UID: "8851f5e9-6cbe-4078-a532-2735e8c5ce73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.360972 4749 scope.go:117] "RemoveContainer" containerID="06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.361561 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7\": container with ID starting with 06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7 not found: ID does not exist" containerID="06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.361604 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7"} err="failed to get container status \"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7\": rpc error: code = NotFound desc = could not find container \"06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7\": container with ID starting with 06065ef1dbeea7659d9cdfdada981dc79046e50c1e9ea8ed99a36cae6a262fb7 not found: ID does not exist" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.361633 4749 scope.go:117] "RemoveContainer" containerID="aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.362008 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d\": container with ID starting with aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d not found: ID does not exist" containerID="aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.362028 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d"} err="failed to get container status \"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d\": rpc error: code = NotFound desc = could not find container \"aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d\": container with ID starting with aa9e8ccc0c556a81bd488c5ac71f7d02fe40632fafb53b410a66272db10c530d not found: ID does not exist" Nov 29 
01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.362041 4749 scope.go:117] "RemoveContainer" containerID="be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.362281 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0\": container with ID starting with be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0 not found: ID does not exist" containerID="be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.362301 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0"} err="failed to get container status \"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0\": rpc error: code = NotFound desc = could not find container \"be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0\": container with ID starting with be875750ab4dd85220f7451bff32b9a2aac5054e53dbcb17153651aaf6b68ea0 not found: ID does not exist" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.362314 4749 scope.go:117] "RemoveContainer" containerID="67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.362709 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42\": container with ID starting with 67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42 not found: ID does not exist" containerID="67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.362732 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42"} err="failed to get container status \"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42\": rpc error: code = NotFound desc = could not find container \"67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42\": container with ID starting with 67c6bef9b00ea2f9ad097e1d5a27f88848cf041f7bb26c1050606735ba622f42 not found: ID does not exist" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.378361 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.378405 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.378417 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.378428 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8851f5e9-6cbe-4078-a532-2735e8c5ce73-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:34 crc kubenswrapper[4749]: 
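
The RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" pairs above are benign: the kubelet's container garbage collector asks CRI-O about containers that have already been pruned, and a NotFound answer simply means the deletion already happened. The same "not there yet / already gone" race produced the earlier cAdvisor "Failed to process watch event ... 404" warnings, where a cgroup appeared before CRI-O had registered the container. The standard gRPC idiom for tolerating this, sketched with a stand-in remove function rather than the real CRI client:

    // notfound.go - the "treat NotFound as success" idiom the kubelet applies
    // when the CRI runtime has already pruned a container.
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    func removeContainer(remove func(id string) error, id string) error {
        if err := remove(id); err != nil {
            if status.Code(err) == codes.NotFound {
                return nil // already gone: deletion is idempotent
            }
            return err
        }
        return nil
    }

    func main() {
        // fake stands in for a CRI RemoveContainer call that reports NotFound,
        // mirroring the "rpc error: code = NotFound" text in the log above.
        fake := func(id string) error {
            return status.Error(codes.NotFound, "could not find container "+id)
        }
        if err := removeContainer(fake, "0efc5a47e1bd"); err == nil {
            fmt.Println("NotFound swallowed; cleanup proceeds")
        }
    }
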
I1129 01:34:34.449413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.460775 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.472000 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.485929 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503076 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503579 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-log" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503612 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-log" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-central-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503649 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-central-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-notification-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503695 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-notification-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503738 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="sg-core" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503750 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="sg-core" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503784 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="proxy-httpd" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503811 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="proxy-httpd" Nov 29 01:34:34 crc kubenswrapper[4749]: E1129 01:34:34.503834 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-httpd" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.503846 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-httpd" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504141 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-central-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504362 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="proxy-httpd" Nov 29 
01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504389 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="ceilometer-notification-agent" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504436 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-log" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504449 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" containerName="sg-core" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.504466 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" containerName="glance-httpd" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.505958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.508734 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.508948 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.516618 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.518862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.533333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.533521 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.535456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.553679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.692926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.692999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4x7q\" (UniqueName: \"kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693055 4749 reconciler_common.go:245] 
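The cpu_manager/memory_manager "RemoveStaleState" entries above fire because ceilometer-0 and glance-default-external-api-0 were recreated under new pod UIDs: resource assignments recorded for the old UIDs are purged before the replacements are admitted. A sketch of that bookkeeping, with an illustrative key/assignments pair of types and truncated UIDs (not the kubelet's real state layout):

```go
// Sketch: drop per-container resource assignments whose pod is no longer
// active, as the RemoveStaleState entries above record.
package main

import "fmt"

type key struct{ podUID, container string }

// assignments maps (podUID, container) to its pinned resource, e.g. a CPU set.
type assignments map[key]string

func removeStaleState(state assignments, activePods map[string]bool) {
	for k := range state {
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: dropping %s/%s\n", k.podUID, k.container)
			delete(state, k) // deleting while ranging over a map is safe in Go
		}
	}
}

func main() {
	state := assignments{
		{podUID: "8851f5e9", container: "sg-core"}:     "2-3",
		{podUID: "73a75b36", container: "glance-httpd"}: "4",
	}
	active := map[string]bool{"bb7bd5a0": true} // only the replacement pod
	removeStaleState(state, active)
	fmt.Println("remaining assignments:", len(state))
}
```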
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693183 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q8gj\" (UniqueName: \"kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693334 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.693395 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794610 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q8gj\" (UniqueName: \"kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs\") pod 
\"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795087 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4x7q\" (UniqueName: \"kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.795959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.794941 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.798224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.798555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.801158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.801256 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.801773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.809985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.810109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.810823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.811316 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.812847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.814699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4x7q\" (UniqueName: \"kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.822925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q8gj\" (UniqueName: \"kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj\") pod \"ceilometer-0\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.828470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.882802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.894704 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.954585 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.144:9292/healthcheck\": read tcp 10.217.0.2:56364->10.217.0.144:9292: read: connection reset by peer" Nov 29 01:34:34 crc kubenswrapper[4749]: I1129 01:34:34.954653 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.144:9292/healthcheck\": read tcp 10.217.0.2:56366->10.217.0.144:9292: read: connection reset by peer" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.092246 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a" path="/var/lib/kubelet/pods/73a75b36-80f4-4d9c-9e2f-d5eb8a09ba4a/volumes" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.093441 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8851f5e9-6cbe-4078-a532-2735e8c5ce73" path="/var/lib/kubelet/pods/8851f5e9-6cbe-4078-a532-2735e8c5ce73/volumes" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.354281 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rss4"] Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.356665 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.368378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rss4"] Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.402625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:34:35 crc kubenswrapper[4749]: W1129 01:34:35.420073 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb7bd5a0_4cec_497b_8acc_3ceebc100bed.slice/crio-bd0634fd128fb1a9dd76d925390a9326ceb90cccf80da049b56f729e306491a1 WatchSource:0}: Error finding container bd0634fd128fb1a9dd76d925390a9326ceb90cccf80da049b56f729e306491a1: Status 404 returned error can't find the container with id bd0634fd128fb1a9dd76d925390a9326ceb90cccf80da049b56f729e306491a1 Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.509431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjp54\" (UniqueName: \"kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.509509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.509544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.526590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:34:35 crc kubenswrapper[4749]: W1129 01:34:35.536360 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff73098b_8f03_43ac_9e1d_3ac7edd2589d.slice/crio-7eefed1f9a37f25e0cea60beaf548d1974bfd1e9132a54bed7b64b56c7b2c78e WatchSource:0}: Error finding container 7eefed1f9a37f25e0cea60beaf548d1974bfd1e9132a54bed7b64b56c7b2c78e: Status 404 returned error can't find the container with id 7eefed1f9a37f25e0cea60beaf548d1974bfd1e9132a54bed7b64b56c7b2c78e Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.614566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.614620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.614717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjp54\" (UniqueName: \"kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.615315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.615415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.657130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjp54\" (UniqueName: \"kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54\") pod \"community-operators-4rss4\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:35 crc kubenswrapper[4749]: I1129 01:34:35.674685 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.017528 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141425 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c28vk\" (UniqueName: \"kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141572 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.141746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.142131 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.145903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs" (OuterVolumeSpecName: "logs") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.147051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk" (OuterVolumeSpecName: "kube-api-access-c28vk") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "kube-api-access-c28vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.148841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts" (OuterVolumeSpecName: "scripts") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.151348 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.159880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerStarted","Data":"bd0634fd128fb1a9dd76d925390a9326ceb90cccf80da049b56f729e306491a1"} Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.183690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.191013 4749 generic.go:334] "Generic (PLEG): container finished" podID="b390a04f-fb35-4166-af23-0b735e2f5266" containerID="49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4" exitCode=0 Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.191076 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.191085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerDied","Data":"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4"} Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.191148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b390a04f-fb35-4166-af23-0b735e2f5266","Type":"ContainerDied","Data":"547a4afd7f21e5d654d8e0386c8b5c88797377a66e39773b43f77bc6eaa388b3"} Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.191164 4749 scope.go:117] "RemoveContainer" containerID="49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.194510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerStarted","Data":"7eefed1f9a37f25e0cea60beaf548d1974bfd1e9132a54bed7b64b56c7b2c78e"} Nov 29 01:34:36 crc kubenswrapper[4749]: E1129 01:34:36.203621 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs podName:b390a04f-fb35-4166-af23-0b735e2f5266 nodeName:}" failed. No retries permitted until 2025-11-29 01:34:36.703570287 +0000 UTC m=+1419.875720144 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266") : error deleting /var/lib/kubelet/pods/b390a04f-fb35-4166-af23-0b735e2f5266/volume-subpaths: remove /var/lib/kubelet/pods/b390a04f-fb35-4166-af23-0b735e2f5266/volume-subpaths: no such file or directory Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.206467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data" (OuterVolumeSpecName: "config-data") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.230116 4749 scope.go:117] "RemoveContainer" containerID="a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243536 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c28vk\" (UniqueName: \"kubernetes.io/projected/b390a04f-fb35-4166-af23-0b735e2f5266-kube-api-access-c28vk\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243562 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243571 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243580 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243588 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b390a04f-fb35-4166-af23-0b735e2f5266-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243616 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.243626 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.262706 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rss4"] Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.268239 4749 scope.go:117] "RemoveContainer" containerID="49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4" Nov 29 01:34:36 crc kubenswrapper[4749]: E1129 01:34:36.273142 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4\": container with ID starting with 49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4 not found: ID does not exist" containerID="49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.273186 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4"} err="failed to get container status \"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4\": rpc error: code = NotFound desc = could not find container \"49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4\": container with ID starting with 49ef74216ff7dfe33bf5a8fb18f91bbf3bde4c8353502e88999e05ec12782ad4 not found: ID does not exist" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.273240 4749 scope.go:117] 
"RemoveContainer" containerID="a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2" Nov 29 01:34:36 crc kubenswrapper[4749]: E1129 01:34:36.277376 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2\": container with ID starting with a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2 not found: ID does not exist" containerID="a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.277422 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2"} err="failed to get container status \"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2\": rpc error: code = NotFound desc = could not find container \"a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2\": container with ID starting with a415b1fcc8fcbdffcf658059b0bb38ffb08a21b07ddbe410edae8def6e2d87c2 not found: ID does not exist" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.284450 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 29 01:34:36 crc kubenswrapper[4749]: W1129 01:34:36.289121 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a2ec6c_3fed_4f0b_b873_d2feccbd31e2.slice/crio-5e215f34852aa3e80eafa44101166c3ebe73088189e68d09d288b502436cf043 WatchSource:0}: Error finding container 5e215f34852aa3e80eafa44101166c3ebe73088189e68d09d288b502436cf043: Status 404 returned error can't find the container with id 5e215f34852aa3e80eafa44101166c3ebe73088189e68d09d288b502436cf043 Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.349597 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.754914 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") pod \"b390a04f-fb35-4166-af23-0b735e2f5266\" (UID: \"b390a04f-fb35-4166-af23-0b735e2f5266\") " Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.759343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b390a04f-fb35-4166-af23-0b735e2f5266" (UID: "b390a04f-fb35-4166-af23-0b735e2f5266"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.835945 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.846330 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.857809 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b390a04f-fb35-4166-af23-0b735e2f5266-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.865526 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:34:36 crc kubenswrapper[4749]: E1129 01:34:36.866440 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-log" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.866466 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-log" Nov 29 01:34:36 crc kubenswrapper[4749]: E1129 01:34:36.866520 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-httpd" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.866528 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-httpd" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.866776 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-httpd" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.866807 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" containerName="glance-log" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.868422 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.873329 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.873658 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.896314 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppsgf\" (UniqueName: \"kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959782 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:36 crc kubenswrapper[4749]: I1129 01:34:36.959813 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppsgf\" (UniqueName: \"kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.062697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.063298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.063675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.065962 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.070358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.071566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.076110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.081834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppsgf\" (UniqueName: \"kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.081881 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.091275 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b390a04f-fb35-4166-af23-0b735e2f5266" path="/var/lib/kubelet/pods/b390a04f-fb35-4166-af23-0b735e2f5266/volumes" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.108029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.204764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerStarted","Data":"81bb8b04fc234401df04769295fca821df1009b777f50ebd985ac7c880a1e11d"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.205765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerStarted","Data":"f72654b1f6476b877d4a3b3f9bd36b7ac0424fa98f947f3608445049745b170c"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.206621 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerID="e3335477f9348fd3750c703649f4ae9adbf222a3324d7ee206726e935ed5ff85" exitCode=0
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.206680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerDied","Data":"e3335477f9348fd3750c703649f4ae9adbf222a3324d7ee206726e935ed5ff85"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.206696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerStarted","Data":"5e215f34852aa3e80eafa44101166c3ebe73088189e68d09d288b502436cf043"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.208586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerStarted","Data":"9d6b37a0cc7a55b62af9b53fbd6409aeac873f1dff484875a19de3e104a7cb9d"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.208636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerStarted","Data":"508dd2be6346cf1635cd7779cf10c81bb796918342597ca0af5e28844eb99452"}
Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.232168 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.232147346 podStartE2EDuration="3.232147346s" podCreationTimestamp="2025-11-29 01:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:37.223857049 +0000 UTC m=+1420.396006906" watchObservedRunningTime="2025-11-29 01:34:37.232147346 +0000 UTC m=+1420.404297213"
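In the startup-latency entry above, podStartSLOduration=3.232147346 is exactly watchObservedRunningTime (01:34:37.232147346) minus podCreationTimestamp (01:34:34); the pull timestamps are the zero time, apparently because no image pull was needed. A small Go sketch reproducing the arithmetic (the monotonic "m=+…" suffix is dropped before parsing; the layout string is an assumption matching the printed format):

package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-29 01:34:34 +0000 UTC")
	running := mustParse("2025-11-29 01:34:37.232147346 +0000 UTC")
	// No image pull was recorded (firstStartedPulling is the zero time),
	// so the SLO duration equals the full end-to-end duration.
	fmt.Println(running.Sub(created)) // 3.232147346s
}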
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:37 crc kubenswrapper[4749]: I1129 01:34:37.915226 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:34:38 crc kubenswrapper[4749]: I1129 01:34:38.233234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerStarted","Data":"05903097a37c9fa98574c40f91838913a16b798c60e11bd1347643979635c559"} Nov 29 01:34:38 crc kubenswrapper[4749]: I1129 01:34:38.234837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerStarted","Data":"416ec9c7231deeac6705ee0bfb2bb8c5d8b38be6094685561cfe5fe231cdac45"} Nov 29 01:34:38 crc kubenswrapper[4749]: I1129 01:34:38.239520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerStarted","Data":"a02f7bc95a94374364e88a78dfb2819c6c1a934b19ec3d15f5c35a0480132105"} Nov 29 01:34:39 crc kubenswrapper[4749]: I1129 01:34:39.253897 4749 generic.go:334] "Generic (PLEG): container finished" podID="e6d35e2c-828c-4e65-bd18-117a1b053783" containerID="3921629adb03e7df47ffbd7236135a367c35c210410b3590dffa731dd7125961" exitCode=0 Nov 29 01:34:39 crc kubenswrapper[4749]: I1129 01:34:39.253969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" event={"ID":"e6d35e2c-828c-4e65-bd18-117a1b053783","Type":"ContainerDied","Data":"3921629adb03e7df47ffbd7236135a367c35c210410b3590dffa731dd7125961"} Nov 29 01:34:39 crc kubenswrapper[4749]: I1129 01:34:39.258103 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerID="a02f7bc95a94374364e88a78dfb2819c6c1a934b19ec3d15f5c35a0480132105" exitCode=0 Nov 29 01:34:39 crc kubenswrapper[4749]: I1129 01:34:39.258146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerDied","Data":"a02f7bc95a94374364e88a78dfb2819c6c1a934b19ec3d15f5c35a0480132105"} Nov 29 01:34:39 crc kubenswrapper[4749]: I1129 01:34:39.263266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerStarted","Data":"81a360b2af0a80524dd327e6ff413357d3ae93289f88811a965e85a9ada36073"} Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.278755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerStarted","Data":"60d404c089766276e2b776b298e5f3050db4dd04153458e59685a427fe248694"} Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.282387 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerStarted","Data":"1fd08de4ea6651ede49c8cca90845b37862495f840aa6ba8adc4e63fe642cd3b"} Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.291995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerStarted","Data":"dce0dcf10cdf3973e3c9589cc01bb169f72a8615aa9262621e47766e0365d0ee"} Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.292191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.314812 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.314791202 podStartE2EDuration="4.314791202s" podCreationTimestamp="2025-11-29 01:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:40.306652269 +0000 UTC m=+1423.478802166" watchObservedRunningTime="2025-11-29 01:34:40.314791202 +0000 UTC m=+1423.486941069" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.346443 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rss4" podStartSLOduration=2.84515164 podStartE2EDuration="5.346399912s" podCreationTimestamp="2025-11-29 01:34:35 +0000 UTC" firstStartedPulling="2025-11-29 01:34:37.208240799 +0000 UTC m=+1420.380390656" lastFinishedPulling="2025-11-29 01:34:39.709489071 +0000 UTC m=+1422.881638928" observedRunningTime="2025-11-29 01:34:40.337639863 +0000 UTC m=+1423.509789760" watchObservedRunningTime="2025-11-29 01:34:40.346399912 +0000 UTC m=+1423.518549809" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.381893 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.767801723 podStartE2EDuration="6.381872577s" podCreationTimestamp="2025-11-29 01:34:34 +0000 UTC" firstStartedPulling="2025-11-29 01:34:35.423179485 +0000 UTC m=+1418.595329342" lastFinishedPulling="2025-11-29 01:34:39.037250339 +0000 UTC m=+1422.209400196" observedRunningTime="2025-11-29 01:34:40.369141359 +0000 UTC m=+1423.541291256" watchObservedRunningTime="2025-11-29 01:34:40.381872577 +0000 UTC m=+1423.554022444" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.706184 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.842080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8sw7\" (UniqueName: \"kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7\") pod \"e6d35e2c-828c-4e65-bd18-117a1b053783\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.842235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts\") pod \"e6d35e2c-828c-4e65-bd18-117a1b053783\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.842284 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle\") pod \"e6d35e2c-828c-4e65-bd18-117a1b053783\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.842309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data\") pod \"e6d35e2c-828c-4e65-bd18-117a1b053783\" (UID: \"e6d35e2c-828c-4e65-bd18-117a1b053783\") " Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.853608 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7" (OuterVolumeSpecName: "kube-api-access-z8sw7") pod "e6d35e2c-828c-4e65-bd18-117a1b053783" (UID: "e6d35e2c-828c-4e65-bd18-117a1b053783"). InnerVolumeSpecName "kube-api-access-z8sw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.858894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts" (OuterVolumeSpecName: "scripts") pod "e6d35e2c-828c-4e65-bd18-117a1b053783" (UID: "e6d35e2c-828c-4e65-bd18-117a1b053783"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.885715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d35e2c-828c-4e65-bd18-117a1b053783" (UID: "e6d35e2c-828c-4e65-bd18-117a1b053783"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.886334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data" (OuterVolumeSpecName: "config-data") pod "e6d35e2c-828c-4e65-bd18-117a1b053783" (UID: "e6d35e2c-828c-4e65-bd18-117a1b053783"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.944846 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8sw7\" (UniqueName: \"kubernetes.io/projected/e6d35e2c-828c-4e65-bd18-117a1b053783-kube-api-access-z8sw7\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.944886 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.944903 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:40 crc kubenswrapper[4749]: I1129 01:34:40.944919 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d35e2c-828c-4e65-bd18-117a1b053783-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.308994 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.309421 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-8f2xf" event={"ID":"e6d35e2c-828c-4e65-bd18-117a1b053783","Type":"ContainerDied","Data":"f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329"} Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.309442 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7983c46a9bb790e82a5ef863da91d5c59edc804b962ef5a52622e28b87fd329" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.395214 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:34:41 crc kubenswrapper[4749]: E1129 01:34:41.395774 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d35e2c-828c-4e65-bd18-117a1b053783" containerName="nova-cell0-conductor-db-sync" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.395797 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d35e2c-828c-4e65-bd18-117a1b053783" containerName="nova-cell0-conductor-db-sync" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.396096 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d35e2c-828c-4e65-bd18-117a1b053783" containerName="nova-cell0-conductor-db-sync" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.396903 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.400900 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.400969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2xnnz" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.429449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.453628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzzn\" (UniqueName: \"kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.453989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.454186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.556252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzzn\" (UniqueName: \"kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.556370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.556417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.560451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.562373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.576394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzzn\" (UniqueName: \"kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn\") pod \"nova-cell0-conductor-0\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:41 crc kubenswrapper[4749]: I1129 01:34:41.719311 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:42 crc kubenswrapper[4749]: W1129 01:34:42.267488 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26a4b5e6_f82a_4316_a7e8_d596136086c2.slice/crio-f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d WatchSource:0}: Error finding container f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d: Status 404 returned error can't find the container with id f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d Nov 29 01:34:42 crc kubenswrapper[4749]: I1129 01:34:42.268740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:34:42 crc kubenswrapper[4749]: I1129 01:34:42.345237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26a4b5e6-f82a-4316-a7e8-d596136086c2","Type":"ContainerStarted","Data":"f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d"} Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.165476 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.168054 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.192045 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.299743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.299853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5qg9\" (UniqueName: \"kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.299933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.355864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26a4b5e6-f82a-4316-a7e8-d596136086c2","Type":"ContainerStarted","Data":"b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040"} Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.356091 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.381275 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.381251071 podStartE2EDuration="2.381251071s" podCreationTimestamp="2025-11-29 01:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:43.373266483 +0000 UTC m=+1426.545416350" watchObservedRunningTime="2025-11-29 01:34:43.381251071 +0000 UTC m=+1426.553400928" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.402128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5qg9\" (UniqueName: \"kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.402289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.403374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.403561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.403933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.423368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5qg9\" (UniqueName: \"kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9\") pod \"redhat-operators-k5ws6\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.499974 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:43 crc kubenswrapper[4749]: I1129 01:34:43.999005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:34:44 crc kubenswrapper[4749]: I1129 01:34:44.367235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerStarted","Data":"0be66b6fbb1b3a2ac2577f7f69766a2eb94cc6e777d3e7d57fdfb9fd1f2a5620"} Nov 29 01:34:44 crc kubenswrapper[4749]: I1129 01:34:44.884023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 01:34:44 crc kubenswrapper[4749]: I1129 01:34:44.884082 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 01:34:44 crc kubenswrapper[4749]: I1129 01:34:44.936039 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 01:34:44 crc kubenswrapper[4749]: I1129 01:34:44.940930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.379701 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerID="a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a" exitCode=0 Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.379838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerDied","Data":"a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a"} Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.380526 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 01:34:45 crc 
Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.380572 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.675305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rss4"
Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.677057 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rss4"
Nov 29 01:34:45 crc kubenswrapper[4749]: I1129 01:34:45.757585 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rss4"
Nov 29 01:34:46 crc kubenswrapper[4749]: I1129 01:34:46.399498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerStarted","Data":"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465"}
Nov 29 01:34:46 crc kubenswrapper[4749]: I1129 01:34:46.495541 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rss4"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.286323 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.296694 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.296775 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.344948 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.347594 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.367908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.418050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:47 crc kubenswrapper[4749]: I1129 01:34:47.420629 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 29 01:34:48 crc kubenswrapper[4749]: I1129 01:34:48.122137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rss4"]
Nov 29 01:34:48 crc kubenswrapper[4749]: I1129 01:34:48.428282 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerID="f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465" exitCode=0
Nov 29 01:34:48 crc kubenswrapper[4749]: I1129 01:34:48.428335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerDied","Data":"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465"}
Nov 29 01:34:49 crc kubenswrapper[4749]: I1129 01:34:49.367674 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:49 crc kubenswrapper[4749]: I1129 01:34:49.368992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 01:34:49 crc kubenswrapper[4749]: I1129 01:34:49.437813 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rss4" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="registry-server" containerID="cri-o://1fd08de4ea6651ede49c8cca90845b37862495f840aa6ba8adc4e63fe642cd3b" gracePeriod=2 Nov 29 01:34:50 crc kubenswrapper[4749]: I1129 01:34:50.453361 4749 generic.go:334] "Generic (PLEG): container finished" podID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerID="1fd08de4ea6651ede49c8cca90845b37862495f840aa6ba8adc4e63fe642cd3b" exitCode=0 Nov 29 01:34:50 crc kubenswrapper[4749]: I1129 01:34:50.453464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerDied","Data":"1fd08de4ea6651ede49c8cca90845b37862495f840aa6ba8adc4e63fe642cd3b"} Nov 29 01:34:50 crc kubenswrapper[4749]: I1129 01:34:50.923729 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.068307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content\") pod \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.068392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjp54\" (UniqueName: \"kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54\") pod \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.068532 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities\") pod \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\" (UID: \"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2\") " Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.069340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities" (OuterVolumeSpecName: "utilities") pod "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" (UID: "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.092166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54" (OuterVolumeSpecName: "kube-api-access-cjp54") pod "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" (UID: "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2"). InnerVolumeSpecName "kube-api-access-cjp54". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.143868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" (UID: "d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.171467 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.171534 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjp54\" (UniqueName: \"kubernetes.io/projected/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-kube-api-access-cjp54\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.171556 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.479007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rss4" event={"ID":"d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2","Type":"ContainerDied","Data":"5e215f34852aa3e80eafa44101166c3ebe73088189e68d09d288b502436cf043"} Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.479091 4749 scope.go:117] "RemoveContainer" containerID="1fd08de4ea6651ede49c8cca90845b37862495f840aa6ba8adc4e63fe642cd3b" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.479034 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rss4" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.487213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerStarted","Data":"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217"} Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.534006 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k5ws6" podStartSLOduration=3.648149098 podStartE2EDuration="8.533956427s" podCreationTimestamp="2025-11-29 01:34:43 +0000 UTC" firstStartedPulling="2025-11-29 01:34:45.382083576 +0000 UTC m=+1428.554233473" lastFinishedPulling="2025-11-29 01:34:50.267890945 +0000 UTC m=+1433.440040802" observedRunningTime="2025-11-29 01:34:51.508767312 +0000 UTC m=+1434.680917179" watchObservedRunningTime="2025-11-29 01:34:51.533956427 +0000 UTC m=+1434.706106314" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.535374 4749 scope.go:117] "RemoveContainer" containerID="a02f7bc95a94374364e88a78dfb2819c6c1a934b19ec3d15f5c35a0480132105" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.550337 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rss4"] Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.558402 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rss4"] Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.560147 4749 scope.go:117] "RemoveContainer" containerID="e3335477f9348fd3750c703649f4ae9adbf222a3324d7ee206726e935ed5ff85" Nov 29 01:34:51 crc kubenswrapper[4749]: I1129 01:34:51.753555 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.294772 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hdhbv"] Nov 29 01:34:52 crc kubenswrapper[4749]: E1129 01:34:52.295187 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="registry-server" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.295214 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="registry-server" Nov 29 01:34:52 crc kubenswrapper[4749]: E1129 01:34:52.295235 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="extract-content" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.295241 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="extract-content" Nov 29 01:34:52 crc kubenswrapper[4749]: E1129 01:34:52.295254 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="extract-utilities" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.295262 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="extract-utilities" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.295449 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" containerName="registry-server" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.296038 4749 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.298706 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.300463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.309721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hdhbv"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.397479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.397708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkp8v\" (UniqueName: \"kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.397749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.397816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.505343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.505462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkp8v\" (UniqueName: \"kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.505487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.505523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.530370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.532850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.536592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.542382 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkp8v\" (UniqueName: \"kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v\") pod \"nova-cell0-cell-mapping-hdhbv\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.630790 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.632183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.632972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.638032 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.653259 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.654896 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.666402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.684347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.708914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.757307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.757363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rbf\" (UniqueName: \"kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.757403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.758160 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.759647 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.762267 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.762468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.815152 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.816874 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.837545 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.838112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860534 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rbf\" (UniqueName: \"kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.860747 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54zx\" (UniqueName: \"kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.878830 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.881914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.909564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rbf\" (UniqueName: \"kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf\") pod \"nova-scheduler-0\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " pod="openstack/nova-scheduler-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.949591 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.950892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.964453 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966287 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n54zx\" (UniqueName: \"kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966440 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dl98\" (UniqueName: 
\"kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966483 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz6r\" (UniqueName: \"kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.966551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.974235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.980611 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:52 crc kubenswrapper[4749]: I1129 01:34:52.997560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.015623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54zx\" (UniqueName: \"kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx\") pod \"nova-api-0\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " pod="openstack/nova-api-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077368 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dl98\" (UniqueName: \"kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz6r\" 
(UniqueName: \"kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58t2t\" (UniqueName: \"kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.077722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.078153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.083993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.084880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.093879 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.101836 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.100260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.105076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dl98\" (UniqueName: \"kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.109982 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz6r\" (UniqueName: \"kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r\") pod \"nova-metadata-0\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") " pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.127984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2" path="/var/lib/kubelet/pods/d2a2ec6c-3fed-4f0b-b873-d2feccbd31e2/volumes" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.182985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58t2t\" (UniqueName: \"kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183344 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.183387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.184923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.186121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.186166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.186610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.203272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58t2t\" (UniqueName: \"kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t\") pod \"dnsmasq-dns-757b4f8459-g5bnn\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.280783 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.308736 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.357589 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.387072 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.394963 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hdhbv"] Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.404794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.500953 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.505472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.589576 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce","Type":"ContainerStarted","Data":"e3b6039e068a734d795891f03de2d7865095e4a119ced0d6e05801d5aff3669c"} Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.603992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hdhbv" event={"ID":"38de4230-2953-456e-86a4-9c6a837a9592","Type":"ContainerStarted","Data":"d1a6b7002d7e9d9bf31a073fb692ba5ab711d41c3283b5bfbcd3b4d6677b9e19"} Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.773777 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f6pkx"] Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.775218 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.783467 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.783505 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.788015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f6pkx"] Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.807320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqr9m\" (UniqueName: \"kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.807380 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.807504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.807766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.909377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.909754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.909782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqr9m\" (UniqueName: \"kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.909840 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.916899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.917177 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.934509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqr9m\" (UniqueName: \"kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.936174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:53 crc kubenswrapper[4749]: I1129 01:34:53.950800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-f6pkx\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") " pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.015725 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.043102 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.098166 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.179149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:34:54 crc kubenswrapper[4749]: W1129 01:34:54.188599 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9de2381_be42_4964_a9dd_f98253684b5e.slice/crio-02ecc5e18ac1260af28f776613b5e2f70c47fd32091cab21f3071bbb04a4c655 WatchSource:0}: Error finding container 02ecc5e18ac1260af28f776613b5e2f70c47fd32091cab21f3071bbb04a4c655: Status 404 returned error can't find the container with id 02ecc5e18ac1260af28f776613b5e2f70c47fd32091cab21f3071bbb04a4c655 Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.587884 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k5ws6" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" probeResult="failure" output=< Nov 29 01:34:54 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:34:54 crc kubenswrapper[4749]: > Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.611955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9de2381-be42-4964-a9dd-f98253684b5e","Type":"ContainerStarted","Data":"02ecc5e18ac1260af28f776613b5e2f70c47fd32091cab21f3071bbb04a4c655"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.613173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerStarted","Data":"cc934507f2f648ee421132b7fb34a983cfe6b48a9e728960dcf63ae218c8a7fd"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.620591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerStarted","Data":"5d2e00bf6da7daca33eb959a2cd95edf35d9c6816099ea5aaa1d598075f9b03c"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.628525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hdhbv" event={"ID":"38de4230-2953-456e-86a4-9c6a837a9592","Type":"ContainerStarted","Data":"82990d2dcf0c8a627fe2e51a0ec691a173345be20d660bbf2193506a1de812d5"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.631954 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerID="5144de4befe6c5a404315ac3261b56ae92db97d58334720fc31752c8b0925912" exitCode=0 Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.631994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" event={"ID":"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d","Type":"ContainerDied","Data":"5144de4befe6c5a404315ac3261b56ae92db97d58334720fc31752c8b0925912"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.632011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" event={"ID":"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d","Type":"ContainerStarted","Data":"9475d37a13b715892df79770e9c1221c607faed7b71b2dc0c09f8bf0891f0f85"} Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.632701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f6pkx"] Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.659254 4749 
Nov 29 01:34:54 crc kubenswrapper[4749]: I1129 01:34:54.659254 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hdhbv" podStartSLOduration=2.65923789 podStartE2EDuration="2.65923789s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:54.65799892 +0000 UTC m=+1437.830148797" watchObservedRunningTime="2025-11-29 01:34:54.65923789 +0000 UTC m=+1437.831387747"
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.643145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" event={"ID":"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d","Type":"ContainerStarted","Data":"fe391866b222bdc27b02c092a42376ca48bf580b3de4ed28527aad1c47cfa12b"}
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.644612 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn"
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.647275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" event={"ID":"b724452e-7b4b-4a1b-af78-754fee94a0b5","Type":"ContainerStarted","Data":"057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547"}
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.647310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" event={"ID":"b724452e-7b4b-4a1b-af78-754fee94a0b5","Type":"ContainerStarted","Data":"7626659c66b6593cd3d7f678ee79caba8b553f48c2ff22a8511750e51d150129"}
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.663025 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" podStartSLOduration=3.663012996 podStartE2EDuration="3.663012996s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:55.659634412 +0000 UTC m=+1438.831784299" watchObservedRunningTime="2025-11-29 01:34:55.663012996 +0000 UTC m=+1438.835162853"
Nov 29 01:34:55 crc kubenswrapper[4749]: I1129 01:34:55.682483 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" podStartSLOduration=2.682466348 podStartE2EDuration="2.682466348s" podCreationTimestamp="2025-11-29 01:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:34:55.677501235 +0000 UTC m=+1438.849651112" watchObservedRunningTime="2025-11-29 01:34:55.682466348 +0000 UTC m=+1438.854616205"
Nov 29 01:34:56 crc kubenswrapper[4749]: I1129 01:34:56.250782 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:34:56 crc kubenswrapper[4749]: I1129 01:34:56.262869 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.214524 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"]
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.221157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.222775 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"]
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.295221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.295330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqrzv\" (UniqueName: \"kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.295540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.397269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.397363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.397405 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqrzv\" (UniqueName: \"kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.398051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.398279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.417951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqrzv\" (UniqueName: \"kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv\") pod \"certified-operators-h5vsn\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:57 crc kubenswrapper[4749]: I1129 01:34:57.586537 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.313399 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"]
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.676759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerStarted","Data":"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.677025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerStarted","Data":"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.676880 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-metadata" containerID="cri-o://fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" gracePeriod=30
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.676859 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-log" containerID="cri-o://ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" gracePeriod=30
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.679740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9de2381-be42-4964-a9dd-f98253684b5e","Type":"ContainerStarted","Data":"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.679850 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b9de2381-be42-4964-a9dd-f98253684b5e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532" gracePeriod=30
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.687782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerStarted","Data":"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.687829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerStarted","Data":"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.690701 4749 generic.go:334] "Generic (PLEG): container finished" podID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerID="736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6" exitCode=0
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.690777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerDied","Data":"736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.690794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerStarted","Data":"91d0d2d9a62f4a3c8550761291536c6ece4bdac21442e60a9fdc667a5235d411"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.693577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce","Type":"ContainerStarted","Data":"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399"}
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.703169 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.935745349 podStartE2EDuration="6.703152039s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="2025-11-29 01:34:53.926590909 +0000 UTC m=+1437.098740766" lastFinishedPulling="2025-11-29 01:34:57.693997599 +0000 UTC m=+1440.866147456" observedRunningTime="2025-11-29 01:34:58.696105714 +0000 UTC m=+1441.868255571" watchObservedRunningTime="2025-11-29 01:34:58.703152039 +0000 UTC m=+1441.875301896"
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.714401 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.437094811 podStartE2EDuration="6.714384217s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="2025-11-29 01:34:53.415830781 +0000 UTC m=+1436.587980638" lastFinishedPulling="2025-11-29 01:34:57.693120187 +0000 UTC m=+1440.865270044" observedRunningTime="2025-11-29 01:34:58.709490996 +0000 UTC m=+1441.881640863" watchObservedRunningTime="2025-11-29 01:34:58.714384217 +0000 UTC m=+1441.886534074"
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.751497 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.099850418 podStartE2EDuration="6.751478987s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="2025-11-29 01:34:54.04195914 +0000 UTC m=+1437.214108987" lastFinishedPulling="2025-11-29 01:34:57.693587699 +0000 UTC m=+1440.865737556" observedRunningTime="2025-11-29 01:34:58.750114083 +0000 UTC m=+1441.922263940" watchObservedRunningTime="2025-11-29 01:34:58.751478987 +0000 UTC m=+1441.923628864"
Nov 29 01:34:58 crc kubenswrapper[4749]: I1129 01:34:58.771998 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.241854641 podStartE2EDuration="6.771980626s" podCreationTimestamp="2025-11-29 01:34:52 +0000 UTC" firstStartedPulling="2025-11-29 01:34:54.20039893 +0000 UTC m=+1437.372548787" lastFinishedPulling="2025-11-29 01:34:57.730524915 +0000 UTC m=+1440.902674772" observedRunningTime="2025-11-29 01:34:58.768303364 +0000 UTC m=+1441.940453221" watchObservedRunningTime="2025-11-29 01:34:58.771980626 +0000 UTC m=+1441.944130483"
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.256646 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.336552 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data\") pod \"99b20884-9892-45be-9e98-61d3d70dd441\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") "
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.336745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle\") pod \"99b20884-9892-45be-9e98-61d3d70dd441\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") "
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.336812 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs\") pod \"99b20884-9892-45be-9e98-61d3d70dd441\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") "
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.336926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnz6r\" (UniqueName: \"kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r\") pod \"99b20884-9892-45be-9e98-61d3d70dd441\" (UID: \"99b20884-9892-45be-9e98-61d3d70dd441\") "
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.337808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs" (OuterVolumeSpecName: "logs") pod "99b20884-9892-45be-9e98-61d3d70dd441" (UID: "99b20884-9892-45be-9e98-61d3d70dd441"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.342270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r" (OuterVolumeSpecName: "kube-api-access-rnz6r") pod "99b20884-9892-45be-9e98-61d3d70dd441" (UID: "99b20884-9892-45be-9e98-61d3d70dd441"). InnerVolumeSpecName "kube-api-access-rnz6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.371834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data" (OuterVolumeSpecName: "config-data") pod "99b20884-9892-45be-9e98-61d3d70dd441" (UID: "99b20884-9892-45be-9e98-61d3d70dd441"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.374702 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99b20884-9892-45be-9e98-61d3d70dd441" (UID: "99b20884-9892-45be-9e98-61d3d70dd441"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.439740 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.439777 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b20884-9892-45be-9e98-61d3d70dd441-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.439791 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnz6r\" (UniqueName: \"kubernetes.io/projected/99b20884-9892-45be-9e98-61d3d70dd441-kube-api-access-rnz6r\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.439804 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b20884-9892-45be-9e98-61d3d70dd441-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.704907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerStarted","Data":"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a"}
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.711918 4749 generic.go:334] "Generic (PLEG): container finished" podID="99b20884-9892-45be-9e98-61d3d70dd441" containerID="fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" exitCode=0
Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.711955 4749 generic.go:334] "Generic (PLEG): container finished" podID="99b20884-9892-45be-9e98-61d3d70dd441" containerID="ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" exitCode=143
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.713819 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerDied","Data":"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464"} Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.714484 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerDied","Data":"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc"} Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.714542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"99b20884-9892-45be-9e98-61d3d70dd441","Type":"ContainerDied","Data":"5d2e00bf6da7daca33eb959a2cd95edf35d9c6816099ea5aaa1d598075f9b03c"} Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.714575 4749 scope.go:117] "RemoveContainer" containerID="fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.755857 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.771444 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.790938 4749 scope.go:117] "RemoveContainer" containerID="ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.796537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:59 crc kubenswrapper[4749]: E1129 01:34:59.797018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-metadata" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.797039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-metadata" Nov 29 01:34:59 crc kubenswrapper[4749]: E1129 01:34:59.797049 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-log" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.797056 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-log" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.797246 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-log" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.797268 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99b20884-9892-45be-9e98-61d3d70dd441" containerName="nova-metadata-metadata" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.798298 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.800787 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.800950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.821501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.828429 4749 scope.go:117] "RemoveContainer" containerID="fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" Nov 29 01:34:59 crc kubenswrapper[4749]: E1129 01:34:59.829019 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464\": container with ID starting with fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464 not found: ID does not exist" containerID="fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.829062 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464"} err="failed to get container status \"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464\": rpc error: code = NotFound desc = could not find container \"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464\": container with ID starting with fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464 not found: ID does not exist" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.829088 4749 scope.go:117] "RemoveContainer" containerID="ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" Nov 29 01:34:59 crc kubenswrapper[4749]: E1129 01:34:59.830067 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc\": container with ID starting with ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc not found: ID does not exist" containerID="ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.830086 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc"} err="failed to get container status \"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc\": rpc error: code = NotFound desc = could not find container \"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc\": container with ID starting with ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc not found: ID does not exist" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.830100 4749 scope.go:117] "RemoveContainer" containerID="fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.830478 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464"} err="failed to get container status \"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464\": rpc error: 
code = NotFound desc = could not find container \"fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464\": container with ID starting with fea5c692e1e31f211f0cc4f5ca8747d99fa022083f54545d217d70d10e106464 not found: ID does not exist" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.830491 4749 scope.go:117] "RemoveContainer" containerID="ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.830685 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc"} err="failed to get container status \"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc\": rpc error: code = NotFound desc = could not find container \"ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc\": container with ID starting with ef9800c7f8c8bfd5f468fe7deef008dc746e05268a18cf3862032d71608440fc not found: ID does not exist" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.850364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h44m\" (UniqueName: \"kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.850414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.850444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.850514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.850557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.951880 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.951964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.952028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h44m\" (UniqueName: \"kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.952054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.952076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.952524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.957595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.957762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.958512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:34:59 crc kubenswrapper[4749]: I1129 01:34:59.981983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h44m\" (UniqueName: \"kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m\") pod \"nova-metadata-0\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") " pod="openstack/nova-metadata-0" Nov 29 01:35:00 crc kubenswrapper[4749]: I1129 01:35:00.145066 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:35:00 crc kubenswrapper[4749]: I1129 01:35:00.683649 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:35:00 crc kubenswrapper[4749]: I1129 01:35:00.722452 4749 generic.go:334] "Generic (PLEG): container finished" podID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerID="930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a" exitCode=0 Nov 29 01:35:00 crc kubenswrapper[4749]: I1129 01:35:00.722539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerDied","Data":"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a"} Nov 29 01:35:00 crc kubenswrapper[4749]: I1129 01:35:00.724593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerStarted","Data":"16e855389d2f049af703322babca2ff9fac380a54c3e73872a855bd5ad503daa"} Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.090599 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99b20884-9892-45be-9e98-61d3d70dd441" path="/var/lib/kubelet/pods/99b20884-9892-45be-9e98-61d3d70dd441/volumes" Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.735736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerStarted","Data":"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"} Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.736088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerStarted","Data":"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"} Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.738175 4749 generic.go:334] "Generic (PLEG): container finished" podID="38de4230-2953-456e-86a4-9c6a837a9592" containerID="82990d2dcf0c8a627fe2e51a0ec691a173345be20d660bbf2193506a1de812d5" exitCode=0 Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.738296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hdhbv" event={"ID":"38de4230-2953-456e-86a4-9c6a837a9592","Type":"ContainerDied","Data":"82990d2dcf0c8a627fe2e51a0ec691a173345be20d660bbf2193506a1de812d5"} Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.741431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerStarted","Data":"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc"} Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.754176 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.75416329 podStartE2EDuration="2.75416329s" podCreationTimestamp="2025-11-29 01:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:01.753461313 +0000 UTC m=+1444.925611180" watchObservedRunningTime="2025-11-29 01:35:01.75416329 +0000 UTC m=+1444.926313147" Nov 29 01:35:01 crc kubenswrapper[4749]: I1129 01:35:01.775282 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-h5vsn" podStartSLOduration=2.195738596 podStartE2EDuration="4.775192432s" podCreationTimestamp="2025-11-29 01:34:57 +0000 UTC" firstStartedPulling="2025-11-29 01:34:58.693824107 +0000 UTC m=+1441.865973964" lastFinishedPulling="2025-11-29 01:35:01.273277943 +0000 UTC m=+1444.445427800" observedRunningTime="2025-11-29 01:35:01.771757497 +0000 UTC m=+1444.943907374" watchObservedRunningTime="2025-11-29 01:35:01.775192432 +0000 UTC m=+1444.947342279" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.094821 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.096733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.128180 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.153038 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:35:03 crc kubenswrapper[4749]: E1129 01:35:03.202648 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb724452e_7b4b_4a1b_af78_754fee94a0b5.slice/crio-conmon-057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb724452e_7b4b_4a1b_af78_754fee94a0b5.slice/crio-057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547.scope\": RecentStats: unable to find data in memory cache]" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.215767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data\") pod \"38de4230-2953-456e-86a4-9c6a837a9592\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.215968 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle\") pod \"38de4230-2953-456e-86a4-9c6a837a9592\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.219357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts\") pod \"38de4230-2953-456e-86a4-9c6a837a9592\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.219434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkp8v\" (UniqueName: \"kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v\") pod \"38de4230-2953-456e-86a4-9c6a837a9592\" (UID: \"38de4230-2953-456e-86a4-9c6a837a9592\") " Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.229156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v" (OuterVolumeSpecName: "kube-api-access-tkp8v") pod 
"38de4230-2953-456e-86a4-9c6a837a9592" (UID: "38de4230-2953-456e-86a4-9c6a837a9592"). InnerVolumeSpecName "kube-api-access-tkp8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.232534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts" (OuterVolumeSpecName: "scripts") pod "38de4230-2953-456e-86a4-9c6a837a9592" (UID: "38de4230-2953-456e-86a4-9c6a837a9592"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.252901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38de4230-2953-456e-86a4-9c6a837a9592" (UID: "38de4230-2953-456e-86a4-9c6a837a9592"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.262159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data" (OuterVolumeSpecName: "config-data") pod "38de4230-2953-456e-86a4-9c6a837a9592" (UID: "38de4230-2953-456e-86a4-9c6a837a9592"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.282228 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.282502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.322735 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.322770 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.322782 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38de4230-2953-456e-86a4-9c6a837a9592-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.322790 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkp8v\" (UniqueName: \"kubernetes.io/projected/38de4230-2953-456e-86a4-9c6a837a9592-kube-api-access-tkp8v\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.358078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.406419 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.496445 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"] Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.497088 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="dnsmasq-dns" containerID="cri-o://e5bed771bb234e6165f9372ee25f274f50c7368489100dfd707cbe3e43b234ea" gracePeriod=10 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.764414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hdhbv" event={"ID":"38de4230-2953-456e-86a4-9c6a837a9592","Type":"ContainerDied","Data":"d1a6b7002d7e9d9bf31a073fb692ba5ab711d41c3283b5bfbcd3b4d6677b9e19"} Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.764486 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a6b7002d7e9d9bf31a073fb692ba5ab711d41c3283b5bfbcd3b4d6677b9e19" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.764584 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hdhbv" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.771319 4749 generic.go:334] "Generic (PLEG): container finished" podID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerID="e5bed771bb234e6165f9372ee25f274f50c7368489100dfd707cbe3e43b234ea" exitCode=0 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.771384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" event={"ID":"a69dc8b1-6d28-4872-97d2-471104e468fe","Type":"ContainerDied","Data":"e5bed771bb234e6165f9372ee25f274f50c7368489100dfd707cbe3e43b234ea"} Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.775649 4749 generic.go:334] "Generic (PLEG): container finished" podID="b724452e-7b4b-4a1b-af78-754fee94a0b5" containerID="057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547" exitCode=0 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.775727 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" event={"ID":"b724452e-7b4b-4a1b-af78-754fee94a0b5","Type":"ContainerDied","Data":"057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547"} Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.851433 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.970923 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.971156 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-log" containerID="cri-o://2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57" gracePeriod=30 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.971295 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-api" containerID="cri-o://92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2" gracePeriod=30 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.975322 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.975403 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.990852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.991330 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-log" containerID="cri-o://0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f" gracePeriod=30 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.991403 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-metadata" containerID="cri-o://f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81" gracePeriod=30 Nov 29 01:35:03 crc kubenswrapper[4749]: I1129 01:35:03.999783 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043308 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nptw7\" (UniqueName: \"kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043684 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.043769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb\") pod \"a69dc8b1-6d28-4872-97d2-471104e468fe\" (UID: \"a69dc8b1-6d28-4872-97d2-471104e468fe\") " Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.055340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7" 
(OuterVolumeSpecName: "kube-api-access-nptw7") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "kube-api-access-nptw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.119049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.133937 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config" (OuterVolumeSpecName: "config") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.135966 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.143122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.147825 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.147860 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.147875 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nptw7\" (UniqueName: \"kubernetes.io/projected/a69dc8b1-6d28-4872-97d2-471104e468fe-kube-api-access-nptw7\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.147889 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.147900 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.152319 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a69dc8b1-6d28-4872-97d2-471104e468fe" (UID: "a69dc8b1-6d28-4872-97d2-471104e468fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.250322 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a69dc8b1-6d28-4872-97d2-471104e468fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.497984 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.569312 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k5ws6" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" probeResult="failure" output=< Nov 29 01:35:04 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:35:04 crc kubenswrapper[4749]: > Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.687434 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.759909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data\") pod \"abb83ebb-828e-4761-9847-e49672d30be4\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") "
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.760004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs\") pod \"abb83ebb-828e-4761-9847-e49672d30be4\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") "
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.760120 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle\") pod \"abb83ebb-828e-4761-9847-e49672d30be4\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") "
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.760267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs\") pod \"abb83ebb-828e-4761-9847-e49672d30be4\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") "
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.760337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h44m\" (UniqueName: \"kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m\") pod \"abb83ebb-828e-4761-9847-e49672d30be4\" (UID: \"abb83ebb-828e-4761-9847-e49672d30be4\") "
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.760367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs" (OuterVolumeSpecName: "logs") pod "abb83ebb-828e-4761-9847-e49672d30be4" (UID: "abb83ebb-828e-4761-9847-e49672d30be4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.761005 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abb83ebb-828e-4761-9847-e49672d30be4-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.763487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m" (OuterVolumeSpecName: "kube-api-access-7h44m") pod "abb83ebb-828e-4761-9847-e49672d30be4" (UID: "abb83ebb-828e-4761-9847-e49672d30be4"). InnerVolumeSpecName "kube-api-access-7h44m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.791544 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data" (OuterVolumeSpecName: "config-data") pod "abb83ebb-828e-4761-9847-e49672d30be4" (UID: "abb83ebb-828e-4761-9847-e49672d30be4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.793326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abb83ebb-828e-4761-9847-e49672d30be4" (UID: "abb83ebb-828e-4761-9847-e49672d30be4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.794310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf" event={"ID":"a69dc8b1-6d28-4872-97d2-471104e468fe","Type":"ContainerDied","Data":"371d0bdc441b70ecb5a37239f0f99e3a5330ac29913c720f62e1879093bdc7e0"}
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.794365 4749 scope.go:117] "RemoveContainer" containerID="e5bed771bb234e6165f9372ee25f274f50c7368489100dfd707cbe3e43b234ea"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.794500 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-q4zcf"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.804550 4749 generic.go:334] "Generic (PLEG): container finished" podID="e34a6bca-7388-4792-81c3-a181558a168f" containerID="2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57" exitCode=143
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.804645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerDied","Data":"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57"}
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807033 4749 generic.go:334] "Generic (PLEG): container finished" podID="abb83ebb-828e-4761-9847-e49672d30be4" containerID="f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81" exitCode=0
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807056 4749 generic.go:334] "Generic (PLEG): container finished" podID="abb83ebb-828e-4761-9847-e49672d30be4" containerID="0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f" exitCode=143
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerDied","Data":"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"}
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerDied","Data":"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"}
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"abb83ebb-828e-4761-9847-e49672d30be4","Type":"ContainerDied","Data":"16e855389d2f049af703322babca2ff9fac380a54c3e73872a855bd5ad503daa"}
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.807413 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.843634 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"]
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.850215 4749 scope.go:117] "RemoveContainer" containerID="80245bf1375be77093f163d2d36a22b8e4b81e8417dc9026378a606ed08f2aa8"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.851312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "abb83ebb-828e-4761-9847-e49672d30be4" (UID: "abb83ebb-828e-4761-9847-e49672d30be4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.857828 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-q4zcf"]
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.864686 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.864719 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.864732 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/abb83ebb-828e-4761-9847-e49672d30be4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.864743 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h44m\" (UniqueName: \"kubernetes.io/projected/abb83ebb-828e-4761-9847-e49672d30be4-kube-api-access-7h44m\") on node \"crc\" DevicePath \"\""
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.888104 4749 scope.go:117] "RemoveContainer" containerID="f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.906595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 29 01:35:04 crc kubenswrapper[4749]: I1129 01:35:04.975018 4749 scope.go:117] "RemoveContainer" containerID="0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:04.998455 4749 scope.go:117] "RemoveContainer" containerID="f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.001484 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81\": container with ID starting with f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81 not found: ID does not exist" containerID="f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.001532 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"} err="failed to get container status \"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81\": rpc error: code = NotFound desc = could not find container \"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81\": container with ID starting with f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81 not found: ID does not exist"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.001562 4749 scope.go:117] "RemoveContainer" containerID="0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.001884 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f\": container with ID starting with 0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f not found: ID does not exist" containerID="0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.001909 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"} err="failed to get container status \"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f\": rpc error: code = NotFound desc = could not find container \"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f\": container with ID starting with 0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f not found: ID does not exist"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.001926 4749 scope.go:117] "RemoveContainer" containerID="f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.002213 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81"} err="failed to get container status \"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81\": rpc error: code = NotFound desc = could not find container \"f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81\": container with ID starting with f34d423b075e65545be7559d167dfe71e22c43e718179d26a435a29e5a948c81 not found: ID does not exist"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.002238 4749 scope.go:117] "RemoveContainer" containerID="0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.002529 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f"} err="failed to get container status \"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f\": rpc error: code = NotFound desc = could not find container \"0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f\": container with ID starting with 0cc5a2cd3beb8642bc261ab54a6a4449310c73f1597e42785cbf0f7673341d6f not found: ID does not exist"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.090546 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" path="/var/lib/kubelet/pods/a69dc8b1-6d28-4872-97d2-471104e468fe/volumes"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.249454 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f6pkx"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.267589 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.287627 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.323836 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324420 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-log"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324441 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-log"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324471 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b724452e-7b4b-4a1b-af78-754fee94a0b5" containerName="nova-cell1-conductor-db-sync"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b724452e-7b4b-4a1b-af78-754fee94a0b5" containerName="nova-cell1-conductor-db-sync"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324494 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38de4230-2953-456e-86a4-9c6a837a9592" containerName="nova-manage"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324500 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="38de4230-2953-456e-86a4-9c6a837a9592" containerName="nova-manage"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324514 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-metadata"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324520 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-metadata"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="init"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324577 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="init"
Nov 29 01:35:05 crc kubenswrapper[4749]: E1129 01:35:05.324588 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="dnsmasq-dns"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324595 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="dnsmasq-dns"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324769 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="38de4230-2953-456e-86a4-9c6a837a9592" containerName="nova-manage"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324782 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-log"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324795 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b724452e-7b4b-4a1b-af78-754fee94a0b5" containerName="nova-cell1-conductor-db-sync"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324806 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb83ebb-828e-4761-9847-e49672d30be4" containerName="nova-metadata-metadata"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.324852 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69dc8b1-6d28-4872-97d2-471104e468fe" containerName="dnsmasq-dns"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.325950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.330874 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.331279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.332747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.380348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts\") pod \"b724452e-7b4b-4a1b-af78-754fee94a0b5\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") "
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.380491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqr9m\" (UniqueName: \"kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m\") pod \"b724452e-7b4b-4a1b-af78-754fee94a0b5\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") "
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.380654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle\") pod \"b724452e-7b4b-4a1b-af78-754fee94a0b5\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") "
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.380784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data\") pod \"b724452e-7b4b-4a1b-af78-754fee94a0b5\" (UID: \"b724452e-7b4b-4a1b-af78-754fee94a0b5\") "
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.383628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.383725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0"
Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.383806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0"
\"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.399068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.399645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8dm7\" (UniqueName: \"kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.427583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts" (OuterVolumeSpecName: "scripts") pod "b724452e-7b4b-4a1b-af78-754fee94a0b5" (UID: "b724452e-7b4b-4a1b-af78-754fee94a0b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.427694 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m" (OuterVolumeSpecName: "kube-api-access-hqr9m") pod "b724452e-7b4b-4a1b-af78-754fee94a0b5" (UID: "b724452e-7b4b-4a1b-af78-754fee94a0b5"). InnerVolumeSpecName "kube-api-access-hqr9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.453883 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b724452e-7b4b-4a1b-af78-754fee94a0b5" (UID: "b724452e-7b4b-4a1b-af78-754fee94a0b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.476493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data" (OuterVolumeSpecName: "config-data") pod "b724452e-7b4b-4a1b-af78-754fee94a0b5" (UID: "b724452e-7b4b-4a1b-af78-754fee94a0b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.501780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.501859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8dm7\" (UniqueName: \"kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.501943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.501984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502088 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502098 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502108 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqr9m\" (UniqueName: \"kubernetes.io/projected/b724452e-7b4b-4a1b-af78-754fee94a0b5-kube-api-access-hqr9m\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502118 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b724452e-7b4b-4a1b-af78-754fee94a0b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.502482 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.506641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.509026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.509249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.518838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8dm7\" (UniqueName: \"kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7\") pod \"nova-metadata-0\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") " pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.647179 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.823698 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" event={"ID":"b724452e-7b4b-4a1b-af78-754fee94a0b5","Type":"ContainerDied","Data":"7626659c66b6593cd3d7f678ee79caba8b553f48c2ff22a8511750e51d150129"} Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.824012 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7626659c66b6593cd3d7f678ee79caba8b553f48c2ff22a8511750e51d150129" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.824089 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-f6pkx" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.843435 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerName="nova-scheduler-scheduler" containerID="cri-o://c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" gracePeriod=30 Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.902959 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.904392 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.908002 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 01:35:05 crc kubenswrapper[4749]: I1129 01:35:05.912028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.014086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gthg\" (UniqueName: \"kubernetes.io/projected/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-kube-api-access-9gthg\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.014214 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.014284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.116324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.116457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.116548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gthg\" (UniqueName: \"kubernetes.io/projected/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-kube-api-access-9gthg\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.121503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.123245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") " pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.140251 4749 
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.172735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:06 crc kubenswrapper[4749]: W1129 01:35:06.173377 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7013e94a_6bc7_4cda_9577_117ca35a6024.slice/crio-edcaf3ebfe9a4f6b7a96929edb8cc41adddd2fb684c564b11c3de65424ed0a56 WatchSource:0}: Error finding container edcaf3ebfe9a4f6b7a96929edb8cc41adddd2fb684c564b11c3de65424ed0a56: Status 404 returned error can't find the container with id edcaf3ebfe9a4f6b7a96929edb8cc41adddd2fb684c564b11c3de65424ed0a56
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.226042 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.678939 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.854470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerStarted","Data":"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1"}
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.854530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerStarted","Data":"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058"}
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.854542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerStarted","Data":"edcaf3ebfe9a4f6b7a96929edb8cc41adddd2fb684c564b11c3de65424ed0a56"}
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.857269 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1bc6d8b8-4291-4f54-8bb2-508933b39c5a","Type":"ContainerStarted","Data":"328b54dca585a1b974545c1971fe79ba96509cf4f8ea37a2e72792b56929f041"}
Nov 29 01:35:06 crc kubenswrapper[4749]: I1129 01:35:06.881246 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.881226012 podStartE2EDuration="1.881226012s" podCreationTimestamp="2025-11-29 01:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:06.869956843 +0000 UTC m=+1450.042106710" watchObservedRunningTime="2025-11-29 01:35:06.881226012 +0000 UTC m=+1450.053375869"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.102241 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb83ebb-828e-4761-9847-e49672d30be4" path="/var/lib/kubelet/pods/abb83ebb-828e-4761-9847-e49672d30be4/volumes"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.587138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.589064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.639588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.871533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1bc6d8b8-4291-4f54-8bb2-508933b39c5a","Type":"ContainerStarted","Data":"ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5"}
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.872699 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.923962 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.923936864 podStartE2EDuration="2.923936864s" podCreationTimestamp="2025-11-29 01:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:07.894850233 +0000 UTC m=+1451.067000110" watchObservedRunningTime="2025-11-29 01:35:07.923936864 +0000 UTC m=+1451.096086721"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.932446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h5vsn"
Nov 29 01:35:07 crc kubenswrapper[4749]: I1129 01:35:07.993751 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"]
Nov 29 01:35:08 crc kubenswrapper[4749]: E1129 01:35:08.096470 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 29 01:35:08 crc kubenswrapper[4749]: E1129 01:35:08.097960 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 29 01:35:08 crc kubenswrapper[4749]: E1129 01:35:08.099025 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 29 01:35:08 crc kubenswrapper[4749]: E1129 01:35:08.099088 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerName="nova-scheduler-scheduler"
Nov 29 01:35:08 crc kubenswrapper[4749]: I1129 01:35:08.854154 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 29 01:35:08 crc kubenswrapper[4749]: I1129 01:35:08.854420 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" containerName="kube-state-metrics" containerID="cri-o://e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" gracePeriod=30
01:35:08.854420 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" containerName="kube-state-metrics" containerID="cri-o://e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" gracePeriod=30 Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.385340 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.535403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phv28\" (UniqueName: \"kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28\") pod \"f479785d-0431-4aaf-88b6-ad9000996a52\" (UID: \"f479785d-0431-4aaf-88b6-ad9000996a52\") " Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.546540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28" (OuterVolumeSpecName: "kube-api-access-phv28") pod "f479785d-0431-4aaf-88b6-ad9000996a52" (UID: "f479785d-0431-4aaf-88b6-ad9000996a52"). InnerVolumeSpecName "kube-api-access-phv28". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.638109 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phv28\" (UniqueName: \"kubernetes.io/projected/f479785d-0431-4aaf-88b6-ad9000996a52-kube-api-access-phv28\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.893325 4749 generic.go:334] "Generic (PLEG): container finished" podID="f479785d-0431-4aaf-88b6-ad9000996a52" containerID="e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" exitCode=2 Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.893676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f479785d-0431-4aaf-88b6-ad9000996a52","Type":"ContainerDied","Data":"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0"} Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.894102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f479785d-0431-4aaf-88b6-ad9000996a52","Type":"ContainerDied","Data":"80cbc49420a383a93584e75de307dda909a303a4cabdf309bbad574f8fe5b6f9"} Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.894144 4749 scope.go:117] "RemoveContainer" containerID="e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.894225 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h5vsn" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="registry-server" containerID="cri-o://03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc" gracePeriod=2 Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.896414 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.948506 4749 scope.go:117] "RemoveContainer" containerID="e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" Nov 29 01:35:09 crc kubenswrapper[4749]: E1129 01:35:09.949479 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0\": container with ID starting with e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0 not found: ID does not exist" containerID="e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.949522 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0"} err="failed to get container status \"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0\": rpc error: code = NotFound desc = could not find container \"e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0\": container with ID starting with e0f24e8ca7ccaa428351f453a87c068e5dff1ab2befb8e22b632599df8b490a0 not found: ID does not exist" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.950093 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.962733 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.977122 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:35:09 crc kubenswrapper[4749]: E1129 01:35:09.977597 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" containerName="kube-state-metrics" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.977623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" containerName="kube-state-metrics" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.977881 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" containerName="kube-state-metrics" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.978701 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.981046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.981342 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 29 01:35:09 crc kubenswrapper[4749]: I1129 01:35:09.987054 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.151382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngssb\" (UniqueName: \"kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.151630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.151758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.151878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.254330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.254780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.255003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngssb\" (UniqueName: \"kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.255232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.267488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.271976 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.272772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngssb\" (UniqueName: \"kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.276229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") " pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.405153 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.459493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data\") pod \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.459646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle\") pod \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.459819 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5rbf\" (UniqueName: \"kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf\") pod \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\" (UID: \"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.470551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf" (OuterVolumeSpecName: "kube-api-access-b5rbf") pod "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" (UID: "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce"). InnerVolumeSpecName "kube-api-access-b5rbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.489158 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h5vsn" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.491098 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.495142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" (UID: "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.509500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data" (OuterVolumeSpecName: "config-data") pod "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" (UID: "7fdeb3d7-c6b2-4065-b553-6a17c92d04ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.561841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content\") pod \"5670797c-5dde-4198-a730-5e5957f3d7d7\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.562361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqrzv\" (UniqueName: \"kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv\") pod \"5670797c-5dde-4198-a730-5e5957f3d7d7\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.562544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities\") pod \"5670797c-5dde-4198-a730-5e5957f3d7d7\" (UID: \"5670797c-5dde-4198-a730-5e5957f3d7d7\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.563776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities" (OuterVolumeSpecName: "utilities") pod "5670797c-5dde-4198-a730-5e5957f3d7d7" (UID: "5670797c-5dde-4198-a730-5e5957f3d7d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.565948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv" (OuterVolumeSpecName: "kube-api-access-zqrzv") pod "5670797c-5dde-4198-a730-5e5957f3d7d7" (UID: "5670797c-5dde-4198-a730-5e5957f3d7d7"). InnerVolumeSpecName "kube-api-access-zqrzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.567791 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqrzv\" (UniqueName: \"kubernetes.io/projected/5670797c-5dde-4198-a730-5e5957f3d7d7-kube-api-access-zqrzv\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.567808 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5rbf\" (UniqueName: \"kubernetes.io/projected/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-kube-api-access-b5rbf\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.567819 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.567827 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.567834 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.603166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5670797c-5dde-4198-a730-5e5957f3d7d7" (UID: "5670797c-5dde-4198-a730-5e5957f3d7d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.648067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.648113 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.669918 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5670797c-5dde-4198-a730-5e5957f3d7d7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.846504 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.875309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n54zx\" (UniqueName: \"kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx\") pod \"e34a6bca-7388-4792-81c3-a181558a168f\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.875482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle\") pod \"e34a6bca-7388-4792-81c3-a181558a168f\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.875546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data\") pod \"e34a6bca-7388-4792-81c3-a181558a168f\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.875620 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs\") pod \"e34a6bca-7388-4792-81c3-a181558a168f\" (UID: \"e34a6bca-7388-4792-81c3-a181558a168f\") " Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.876751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs" (OuterVolumeSpecName: "logs") pod "e34a6bca-7388-4792-81c3-a181558a168f" (UID: "e34a6bca-7388-4792-81c3-a181558a168f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.879080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx" (OuterVolumeSpecName: "kube-api-access-n54zx") pod "e34a6bca-7388-4792-81c3-a181558a168f" (UID: "e34a6bca-7388-4792-81c3-a181558a168f"). InnerVolumeSpecName "kube-api-access-n54zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.910508 4749 generic.go:334] "Generic (PLEG): container finished" podID="e34a6bca-7388-4792-81c3-a181558a168f" containerID="92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2" exitCode=0 Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.910578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerDied","Data":"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.910605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e34a6bca-7388-4792-81c3-a181558a168f","Type":"ContainerDied","Data":"cc934507f2f648ee421132b7fb34a983cfe6b48a9e728960dcf63ae218c8a7fd"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.910622 4749 scope.go:117] "RemoveContainer" containerID="92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.910740 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.912841 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data" (OuterVolumeSpecName: "config-data") pod "e34a6bca-7388-4792-81c3-a181558a168f" (UID: "e34a6bca-7388-4792-81c3-a181558a168f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.912982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e34a6bca-7388-4792-81c3-a181558a168f" (UID: "e34a6bca-7388-4792-81c3-a181558a168f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.923964 4749 generic.go:334] "Generic (PLEG): container finished" podID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerID="03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc" exitCode=0 Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.924019 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerDied","Data":"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.924044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h5vsn" event={"ID":"5670797c-5dde-4198-a730-5e5957f3d7d7","Type":"ContainerDied","Data":"91d0d2d9a62f4a3c8550761291536c6ece4bdac21442e60a9fdc667a5235d411"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.924105 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h5vsn" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.942118 4749 scope.go:117] "RemoveContainer" containerID="2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.944956 4749 generic.go:334] "Generic (PLEG): container finished" podID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" exitCode=0 Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.945002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce","Type":"ContainerDied","Data":"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.945040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7fdeb3d7-c6b2-4065-b553-6a17c92d04ce","Type":"ContainerDied","Data":"e3b6039e068a734d795891f03de2d7865095e4a119ced0d6e05801d5aff3669c"} Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.945036 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.971934 4749 scope.go:117] "RemoveContainer" containerID="92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2" Nov 29 01:35:10 crc kubenswrapper[4749]: E1129 01:35:10.973481 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2\": container with ID starting with 92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2 not found: ID does not exist" containerID="92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.973527 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2"} err="failed to get container status \"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2\": rpc error: code = NotFound desc = could not find container \"92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2\": container with ID starting with 92dc3bc0a47682b7b079626b63cc26ae211b7debadb23d98dc70fa6168489db2 not found: ID does not exist" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.973551 4749 scope.go:117] "RemoveContainer" containerID="2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57" Nov 29 01:35:10 crc kubenswrapper[4749]: E1129 01:35:10.974712 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57\": container with ID starting with 2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57 not found: ID does not exist" containerID="2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.974781 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57"} err="failed to get container status \"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57\": rpc error: code = NotFound desc = could not find container \"2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57\": container with ID starting with 2b53db7a91e1f0ff71a34f88bdec2ead692dff45f3cf05ffdfab87868edc3a57 not found: ID does not exist" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.975044 4749 scope.go:117] "RemoveContainer" containerID="03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.987354 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"] Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.993015 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.993051 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e34a6bca-7388-4792-81c3-a181558a168f-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.993062 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n54zx\" (UniqueName: 
\"kubernetes.io/projected/e34a6bca-7388-4792-81c3-a181558a168f-kube-api-access-n54zx\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:10 crc kubenswrapper[4749]: I1129 01:35:10.993071 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34a6bca-7388-4792-81c3-a181558a168f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.009269 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h5vsn"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.032039 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.044848 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.057693 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerName="nova-scheduler-scheduler" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058159 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerName="nova-scheduler-scheduler" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058180 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-api" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058186 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-api" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058209 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="extract-content" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058217 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="extract-content" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058227 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-log" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-log" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058248 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="extract-utilities" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058256 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="extract-utilities" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.058265 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="registry-server" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058271 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="registry-server" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058445 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-api" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058455 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" containerName="registry-server" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058470 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34a6bca-7388-4792-81c3-a181558a168f" containerName="nova-api-log" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.058484 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" containerName="nova-scheduler-scheduler" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.059145 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.060932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.072610 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.072872 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-central-agent" containerID="cri-o://508dd2be6346cf1635cd7779cf10c81bb796918342597ca0af5e28844eb99452" gracePeriod=30 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.073115 4749 scope.go:117] "RemoveContainer" containerID="930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.073245 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="proxy-httpd" containerID="cri-o://dce0dcf10cdf3973e3c9589cc01bb169f72a8615aa9262621e47766e0365d0ee" gracePeriod=30 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.073266 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="sg-core" containerID="cri-o://05903097a37c9fa98574c40f91838913a16b798c60e11bd1347643979635c559" gracePeriod=30 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.073296 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-notification-agent" containerID="cri-o://9d6b37a0cc7a55b62af9b53fbd6409aeac873f1dff484875a19de3e104a7cb9d" gracePeriod=30 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.093005 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5670797c-5dde-4198-a730-5e5957f3d7d7" path="/var/lib/kubelet/pods/5670797c-5dde-4198-a730-5e5957f3d7d7/volumes" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.095161 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fdeb3d7-c6b2-4065-b553-6a17c92d04ce" path="/var/lib/kubelet/pods/7fdeb3d7-c6b2-4065-b553-6a17c92d04ce/volumes" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.095618 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f479785d-0431-4aaf-88b6-ad9000996a52" path="/var/lib/kubelet/pods/f479785d-0431-4aaf-88b6-ad9000996a52/volumes" Nov 29 01:35:11 crc kubenswrapper[4749]: 
I1129 01:35:11.113923 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: W1129 01:35:11.117272 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf9cf96c_6bdd_425d_8983_4bfa2250edda.slice/crio-b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a WatchSource:0}: Error finding container b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a: Status 404 returned error can't find the container with id b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.129987 4749 scope.go:117] "RemoveContainer" containerID="736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.131894 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.189686 4749 scope.go:117] "RemoveContainer" containerID="03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.190485 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc\": container with ID starting with 03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc not found: ID does not exist" containerID="03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.190521 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc"} err="failed to get container status \"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc\": rpc error: code = NotFound desc = could not find container \"03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc\": container with ID starting with 03cba146bf99de4362d16342c78da5b705d98fa94d93a91fb0d8b5ebc87309dc not found: ID does not exist" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.190543 4749 scope.go:117] "RemoveContainer" containerID="930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.190837 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a\": container with ID starting with 930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a not found: ID does not exist" containerID="930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.190881 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a"} err="failed to get container status \"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a\": rpc error: code = NotFound desc = could not find container \"930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a\": container with ID starting with 930243b905a99de8c4e8f66141648b2205d772236a962c8c488f7461cbc0065a not found: ID does not exist" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.190914 4749 scope.go:117] 
"RemoveContainer" containerID="736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.191414 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6\": container with ID starting with 736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6 not found: ID does not exist" containerID="736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.191439 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6"} err="failed to get container status \"736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6\": rpc error: code = NotFound desc = could not find container \"736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6\": container with ID starting with 736e21bc7cfe5419767f84de3ea701fb7d7ba933a52cbd9b1592d2e5d7a1b1a6 not found: ID does not exist" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.191455 4749 scope.go:117] "RemoveContainer" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.196600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2h5t\" (UniqueName: \"kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.196668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.196948 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.256639 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.285920 4749 scope.go:117] "RemoveContainer" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" Nov 29 01:35:11 crc kubenswrapper[4749]: E1129 01:35:11.286440 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399\": container with ID starting with c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399 not found: ID does not exist" containerID="c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.286464 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399"} 
err="failed to get container status \"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399\": rpc error: code = NotFound desc = could not find container \"c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399\": container with ID starting with c48ea1b9cc9de3a64098633827397462a30485b8fe7aab6f885792a3cb4cd399 not found: ID does not exist" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.299004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2h5t\" (UniqueName: \"kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.299082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.299162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.303294 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.304771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.316917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.326652 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.326735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2h5t\" (UniqueName: \"kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t\") pod \"nova-scheduler-0\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.338334 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.341456 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.346634 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.347302 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.386503 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.400687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.400797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.400833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxn59\" (UniqueName: \"kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.400857 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.501994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.502089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.502118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxn59\" (UniqueName: \"kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.502142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.504811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.512562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.513497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.526422 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxn59\" (UniqueName: \"kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59\") pod \"nova-api-0\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.661007 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.836815 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.958347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7e86f35-22ab-440f-b871-7659ae70bd7c","Type":"ContainerStarted","Data":"3fb133ea780a9c82587fd6c6dc27ae3e88b5f5095af4e9c6d884784e00d01c5a"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.959911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf9cf96c-6bdd-425d-8983-4bfa2250edda","Type":"ContainerStarted","Data":"a92e13de7e8f2651d46dd244243eee9c0810da21b48c9f85f4c7ffacbccd381e"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.959935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf9cf96c-6bdd-425d-8983-4bfa2250edda","Type":"ContainerStarted","Data":"b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.960584 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963544 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerID="dce0dcf10cdf3973e3c9589cc01bb169f72a8615aa9262621e47766e0365d0ee" exitCode=0 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963572 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerID="05903097a37c9fa98574c40f91838913a16b798c60e11bd1347643979635c559" exitCode=2 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963581 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerID="508dd2be6346cf1635cd7779cf10c81bb796918342597ca0af5e28844eb99452" exitCode=0 Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerDied","Data":"dce0dcf10cdf3973e3c9589cc01bb169f72a8615aa9262621e47766e0365d0ee"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963739 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerDied","Data":"05903097a37c9fa98574c40f91838913a16b798c60e11bd1347643979635c559"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.963762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerDied","Data":"508dd2be6346cf1635cd7779cf10c81bb796918342597ca0af5e28844eb99452"} Nov 29 01:35:11 crc kubenswrapper[4749]: I1129 01:35:11.980410 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.635351195 podStartE2EDuration="2.980390373s" podCreationTimestamp="2025-11-29 01:35:09 +0000 UTC" firstStartedPulling="2025-11-29 01:35:11.129902399 +0000 UTC m=+1454.302052256" lastFinishedPulling="2025-11-29 01:35:11.474941577 +0000 UTC m=+1454.647091434" observedRunningTime="2025-11-29 01:35:11.976485326 +0000 UTC m=+1455.148635193" watchObservedRunningTime="2025-11-29 01:35:11.980390373 +0000 UTC m=+1455.152540250" Nov 29 01:35:12 crc kubenswrapper[4749]: W1129 01:35:12.128045 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda06fa48d_bb3d_4cf7_b187_358ecee6d10d.slice/crio-c43d59695c1d49677304e22e2dd9e0a0dd91b23dcd5e7e303816567bee5acdd3 WatchSource:0}: Error finding container c43d59695c1d49677304e22e2dd9e0a0dd91b23dcd5e7e303816567bee5acdd3: Status 404 returned error can't find the container with id c43d59695c1d49677304e22e2dd9e0a0dd91b23dcd5e7e303816567bee5acdd3 Nov 29 01:35:12 crc kubenswrapper[4749]: I1129 01:35:12.143971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:12 crc kubenswrapper[4749]: I1129 01:35:12.978926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7e86f35-22ab-440f-b871-7659ae70bd7c","Type":"ContainerStarted","Data":"5809fc32e7b9b0080793c6ddca2507282d32e019f797e6a3a4623fbc1225dcd3"} Nov 29 01:35:12 crc kubenswrapper[4749]: I1129 01:35:12.981635 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerStarted","Data":"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3"} Nov 29 01:35:12 crc kubenswrapper[4749]: I1129 01:35:12.981665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerStarted","Data":"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2"} Nov 29 01:35:12 crc kubenswrapper[4749]: I1129 01:35:12.981679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerStarted","Data":"c43d59695c1d49677304e22e2dd9e0a0dd91b23dcd5e7e303816567bee5acdd3"} Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.055578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.05554666 podStartE2EDuration="2.05554666s" podCreationTimestamp="2025-11-29 01:35:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:13.027188246 +0000 UTC m=+1456.199338103" watchObservedRunningTime="2025-11-29 01:35:13.05554666 +0000 UTC m=+1456.227696517" Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.105176 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34a6bca-7388-4792-81c3-a181558a168f" path="/var/lib/kubelet/pods/e34a6bca-7388-4792-81c3-a181558a168f/volumes" Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.117653 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.11763182 podStartE2EDuration="2.11763182s" podCreationTimestamp="2025-11-29 01:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:13.08055114 +0000 UTC m=+1456.252701017" watchObservedRunningTime="2025-11-29 01:35:13.11763182 +0000 UTC m=+1456.289781677" Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.554817 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.607128 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.993713 4749 generic.go:334] "Generic (PLEG): container finished" podID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerID="9d6b37a0cc7a55b62af9b53fbd6409aeac873f1dff484875a19de3e104a7cb9d" exitCode=0 Nov 29 01:35:13 crc kubenswrapper[4749]: I1129 01:35:13.994644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerDied","Data":"9d6b37a0cc7a55b62af9b53fbd6409aeac873f1dff484875a19de3e104a7cb9d"} Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.074871 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157312 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q8gj\" (UniqueName: \"kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157486 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157531 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157615 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.157711 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml\") pod \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\" (UID: \"bb7bd5a0-4cec-497b-8acc-3ceebc100bed\") " Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.158956 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.159378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.160151 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.160193 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.163346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj" (OuterVolumeSpecName: "kube-api-access-8q8gj") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "kube-api-access-8q8gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.170040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts" (OuterVolumeSpecName: "scripts") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.191684 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.245716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.271418 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q8gj\" (UniqueName: \"kubernetes.io/projected/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-kube-api-access-8q8gj\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.271454 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.271464 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.271473 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.275872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data" (OuterVolumeSpecName: "config-data") pod "bb7bd5a0-4cec-497b-8acc-3ceebc100bed" (UID: "bb7bd5a0-4cec-497b-8acc-3ceebc100bed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.276946 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:35:14 crc kubenswrapper[4749]: I1129 01:35:14.379794 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb7bd5a0-4cec-497b-8acc-3ceebc100bed-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.005892 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.005893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb7bd5a0-4cec-497b-8acc-3ceebc100bed","Type":"ContainerDied","Data":"bd0634fd128fb1a9dd76d925390a9326ceb90cccf80da049b56f729e306491a1"} Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.005983 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k5ws6" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" containerID="cri-o://75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217" gracePeriod=2 Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.006078 4749 scope.go:117] "RemoveContainer" containerID="dce0dcf10cdf3973e3c9589cc01bb169f72a8615aa9262621e47766e0365d0ee" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.053272 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.059761 4749 scope.go:117] "RemoveContainer" containerID="05903097a37c9fa98574c40f91838913a16b798c60e11bd1347643979635c559" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.069347 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.090608 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" path="/var/lib/kubelet/pods/bb7bd5a0-4cec-497b-8acc-3ceebc100bed/volumes" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.091723 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:15 crc kubenswrapper[4749]: E1129 01:35:15.092161 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-notification-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-notification-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: E1129 01:35:15.092291 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-central-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092309 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-central-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: E1129 01:35:15.092321 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="sg-core" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092330 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="sg-core" Nov 29 01:35:15 crc kubenswrapper[4749]: E1129 01:35:15.092352 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="proxy-httpd" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092361 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="proxy-httpd" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092592 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" 
containerName="sg-core" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092624 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="proxy-httpd" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092638 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-notification-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.092650 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7bd5a0-4cec-497b-8acc-3ceebc100bed" containerName="ceilometer-central-agent" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.095306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.100133 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.100845 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.101306 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.101436 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.170440 4749 scope.go:117] "RemoveContainer" containerID="9d6b37a0cc7a55b62af9b53fbd6409aeac873f1dff484875a19de3e104a7cb9d" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.192950 4749 scope.go:117] "RemoveContainer" containerID="508dd2be6346cf1635cd7779cf10c81bb796918342597ca0af5e28844eb99452" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmndg\" (UniqueName: 
\"kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.197741 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmndg\" (UniqueName: \"kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300736 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.300770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.301151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.303887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.308596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.308957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.309282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.309582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.320878 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmndg\" (UniqueName: \"kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.325987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data\") pod \"ceilometer-0\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.443959 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.463929 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.503984 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content\") pod \"3a05a7be-7b1f-4883-9e71-df78caa6c977\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.504074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5qg9\" (UniqueName: \"kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9\") pod \"3a05a7be-7b1f-4883-9e71-df78caa6c977\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.504144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities\") pod \"3a05a7be-7b1f-4883-9e71-df78caa6c977\" (UID: \"3a05a7be-7b1f-4883-9e71-df78caa6c977\") " Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.505094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities" (OuterVolumeSpecName: "utilities") pod "3a05a7be-7b1f-4883-9e71-df78caa6c977" (UID: "3a05a7be-7b1f-4883-9e71-df78caa6c977"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.508049 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9" (OuterVolumeSpecName: "kube-api-access-d5qg9") pod "3a05a7be-7b1f-4883-9e71-df78caa6c977" (UID: "3a05a7be-7b1f-4883-9e71-df78caa6c977"). InnerVolumeSpecName "kube-api-access-d5qg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.603027 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a05a7be-7b1f-4883-9e71-df78caa6c977" (UID: "3a05a7be-7b1f-4883-9e71-df78caa6c977"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.606680 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.606726 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5qg9\" (UniqueName: \"kubernetes.io/projected/3a05a7be-7b1f-4883-9e71-df78caa6c977-kube-api-access-d5qg9\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.606744 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a05a7be-7b1f-4883-9e71-df78caa6c977-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.647617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 01:35:15 crc kubenswrapper[4749]: I1129 01:35:15.647665 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 01:35:16 crc kubenswrapper[4749]: W1129 01:35:16.025226 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ba12835_85db_4dc8_8a7c_8140157a1b2e.slice/crio-141ed00a90063637d4cfbb5a7365be00cef70c5e21f9b30dc1f49b9b4dfa5a8b WatchSource:0}: Error finding container 141ed00a90063637d4cfbb5a7365be00cef70c5e21f9b30dc1f49b9b4dfa5a8b: Status 404 returned error can't find the container with id 141ed00a90063637d4cfbb5a7365be00cef70c5e21f9b30dc1f49b9b4dfa5a8b Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.027892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.041471 4749 generic.go:334] "Generic (PLEG): container finished" podID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerID="75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217" exitCode=0 Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.041528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerDied","Data":"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217"} Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.041554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k5ws6" event={"ID":"3a05a7be-7b1f-4883-9e71-df78caa6c977","Type":"ContainerDied","Data":"0be66b6fbb1b3a2ac2577f7f69766a2eb94cc6e777d3e7d57fdfb9fd1f2a5620"} Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.041577 4749 scope.go:117] "RemoveContainer" containerID="75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.041696 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k5ws6" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.074621 4749 scope.go:117] "RemoveContainer" containerID="f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.085192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.092784 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k5ws6"] Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.114239 4749 scope.go:117] "RemoveContainer" containerID="a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.142233 4749 scope.go:117] "RemoveContainer" containerID="75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217" Nov 29 01:35:16 crc kubenswrapper[4749]: E1129 01:35:16.142665 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217\": container with ID starting with 75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217 not found: ID does not exist" containerID="75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.142707 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217"} err="failed to get container status \"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217\": rpc error: code = NotFound desc = could not find container \"75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217\": container with ID starting with 75812cca410f2daccbc15d738d4478e3e1d2dd23ea5da71df1805b0fe80dd217 not found: ID does not exist" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.142733 4749 scope.go:117] "RemoveContainer" containerID="f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465" Nov 29 01:35:16 crc kubenswrapper[4749]: E1129 01:35:16.143556 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465\": container with ID starting with f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465 not found: ID does not exist" containerID="f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.143590 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465"} err="failed to get container status \"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465\": rpc error: code = NotFound desc = could not find container \"f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465\": container with ID starting with f8898f76abf89bd8b7b5ac50033b8062429acc3760645e358c7d33be4e669465 not found: ID does not exist" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.143605 4749 scope.go:117] "RemoveContainer" containerID="a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a" Nov 29 01:35:16 crc kubenswrapper[4749]: E1129 01:35:16.143805 4749 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a\": container with ID starting with a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a not found: ID does not exist" containerID="a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.143825 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a"} err="failed to get container status \"a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a\": rpc error: code = NotFound desc = could not find container \"a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a\": container with ID starting with a18cac01c57bda108b412f1e75b3bdbd7a205da8d38df056895bc06575d5c35a not found: ID does not exist" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.387384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.656817 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 01:35:16 crc kubenswrapper[4749]: I1129 01:35:16.657679 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 29 01:35:17 crc kubenswrapper[4749]: I1129 01:35:17.063549 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerStarted","Data":"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c"} Nov 29 01:35:17 crc kubenswrapper[4749]: I1129 01:35:17.063597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerStarted","Data":"141ed00a90063637d4cfbb5a7365be00cef70c5e21f9b30dc1f49b9b4dfa5a8b"} Nov 29 01:35:17 crc kubenswrapper[4749]: I1129 01:35:17.088956 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" path="/var/lib/kubelet/pods/3a05a7be-7b1f-4883-9e71-df78caa6c977/volumes" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.080160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerStarted","Data":"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f"} Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.691862 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:18 crc kubenswrapper[4749]: E1129 01:35:18.693039 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="extract-content" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.693071 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" 
containerName="extract-content" Nov 29 01:35:18 crc kubenswrapper[4749]: E1129 01:35:18.693112 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="extract-utilities" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.693125 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="extract-utilities" Nov 29 01:35:18 crc kubenswrapper[4749]: E1129 01:35:18.693166 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.693179 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.693514 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a05a7be-7b1f-4883-9e71-df78caa6c977" containerName="registry-server" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.695939 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.704008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.792900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.793026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.793083 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vvr\" (UniqueName: \"kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.894716 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.894792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vvr\" (UniqueName: \"kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.894910 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.895423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.895429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:18 crc kubenswrapper[4749]: I1129 01:35:18.921747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vvr\" (UniqueName: \"kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr\") pod \"redhat-marketplace-hh4c4\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:19 crc kubenswrapper[4749]: I1129 01:35:19.014209 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:19 crc kubenswrapper[4749]: I1129 01:35:19.110051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerStarted","Data":"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d"} Nov 29 01:35:19 crc kubenswrapper[4749]: W1129 01:35:19.584397 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83366b90_13b0_41af_8544_bbf07e21615e.slice/crio-4edacf64e48ed5ab451c1c3e586f81e4327b2844a2a64314e96ddfcc14de8413 WatchSource:0}: Error finding container 4edacf64e48ed5ab451c1c3e586f81e4327b2844a2a64314e96ddfcc14de8413: Status 404 returned error can't find the container with id 4edacf64e48ed5ab451c1c3e586f81e4327b2844a2a64314e96ddfcc14de8413 Nov 29 01:35:19 crc kubenswrapper[4749]: I1129 01:35:19.586736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.125809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerStarted","Data":"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e"} Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.126765 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.130167 4749 generic.go:334] "Generic (PLEG): container finished" podID="83366b90-13b0-41af-8544-bbf07e21615e" containerID="57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9" exitCode=0 Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.130362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" 
event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerDied","Data":"57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9"} Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.130503 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerStarted","Data":"4edacf64e48ed5ab451c1c3e586f81e4327b2844a2a64314e96ddfcc14de8413"} Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.191396 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.578087897 podStartE2EDuration="5.191373864s" podCreationTimestamp="2025-11-29 01:35:15 +0000 UTC" firstStartedPulling="2025-11-29 01:35:16.038225017 +0000 UTC m=+1459.210374874" lastFinishedPulling="2025-11-29 01:35:19.651510984 +0000 UTC m=+1462.823660841" observedRunningTime="2025-11-29 01:35:20.165527633 +0000 UTC m=+1463.337677510" watchObservedRunningTime="2025-11-29 01:35:20.191373864 +0000 UTC m=+1463.363523731" Nov 29 01:35:20 crc kubenswrapper[4749]: I1129 01:35:20.504828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 01:35:21 crc kubenswrapper[4749]: I1129 01:35:21.147661 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerStarted","Data":"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3"} Nov 29 01:35:21 crc kubenswrapper[4749]: I1129 01:35:21.386908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 01:35:21 crc kubenswrapper[4749]: I1129 01:35:21.424421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 01:35:21 crc kubenswrapper[4749]: I1129 01:35:21.661999 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:35:21 crc kubenswrapper[4749]: I1129 01:35:21.662092 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:35:22 crc kubenswrapper[4749]: I1129 01:35:22.157776 4749 generic.go:334] "Generic (PLEG): container finished" podID="83366b90-13b0-41af-8544-bbf07e21615e" containerID="adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3" exitCode=0 Nov 29 01:35:22 crc kubenswrapper[4749]: I1129 01:35:22.159784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerDied","Data":"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3"} Nov 29 01:35:22 crc kubenswrapper[4749]: I1129 01:35:22.207777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 01:35:22 crc kubenswrapper[4749]: I1129 01:35:22.744392 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 01:35:22 crc kubenswrapper[4749]: I1129 01:35:22.744491 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 01:35:23 crc kubenswrapper[4749]: I1129 01:35:23.170604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerStarted","Data":"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9"} Nov 29 01:35:23 crc kubenswrapper[4749]: I1129 01:35:23.196978 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hh4c4" podStartSLOduration=2.65286389 podStartE2EDuration="5.196960049s" podCreationTimestamp="2025-11-29 01:35:18 +0000 UTC" firstStartedPulling="2025-11-29 01:35:20.13352868 +0000 UTC m=+1463.305678547" lastFinishedPulling="2025-11-29 01:35:22.677624849 +0000 UTC m=+1465.849774706" observedRunningTime="2025-11-29 01:35:23.196946669 +0000 UTC m=+1466.369096546" watchObservedRunningTime="2025-11-29 01:35:23.196960049 +0000 UTC m=+1466.369109896" Nov 29 01:35:25 crc kubenswrapper[4749]: I1129 01:35:25.374739 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:35:25 crc kubenswrapper[4749]: I1129 01:35:25.375085 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:35:25 crc kubenswrapper[4749]: I1129 01:35:25.658488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 01:35:25 crc kubenswrapper[4749]: I1129 01:35:25.660674 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 01:35:25 crc kubenswrapper[4749]: I1129 01:35:25.668062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 01:35:26 crc kubenswrapper[4749]: I1129 01:35:26.217622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.015063 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.015930 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.091112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.098648 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.232566 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data\") pod \"b9de2381-be42-4964-a9dd-f98253684b5e\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.232754 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle\") pod \"b9de2381-be42-4964-a9dd-f98253684b5e\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.232970 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dl98\" (UniqueName: \"kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98\") pod \"b9de2381-be42-4964-a9dd-f98253684b5e\" (UID: \"b9de2381-be42-4964-a9dd-f98253684b5e\") " Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.238781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98" (OuterVolumeSpecName: "kube-api-access-7dl98") pod "b9de2381-be42-4964-a9dd-f98253684b5e" (UID: "b9de2381-be42-4964-a9dd-f98253684b5e"). InnerVolumeSpecName "kube-api-access-7dl98". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.267181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9de2381-be42-4964-a9dd-f98253684b5e" (UID: "b9de2381-be42-4964-a9dd-f98253684b5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.268518 4749 generic.go:334] "Generic (PLEG): container finished" podID="b9de2381-be42-4964-a9dd-f98253684b5e" containerID="81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532" exitCode=137 Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.268720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9de2381-be42-4964-a9dd-f98253684b5e","Type":"ContainerDied","Data":"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532"} Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.268934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b9de2381-be42-4964-a9dd-f98253684b5e","Type":"ContainerDied","Data":"02ecc5e18ac1260af28f776613b5e2f70c47fd32091cab21f3071bbb04a4c655"} Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.269279 4749 scope.go:117] "RemoveContainer" containerID="81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.269552 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.284306 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data" (OuterVolumeSpecName: "config-data") pod "b9de2381-be42-4964-a9dd-f98253684b5e" (UID: "b9de2381-be42-4964-a9dd-f98253684b5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.336117 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.336372 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de2381-be42-4964-a9dd-f98253684b5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.336468 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dl98\" (UniqueName: \"kubernetes.io/projected/b9de2381-be42-4964-a9dd-f98253684b5e-kube-api-access-7dl98\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.367260 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.377360 4749 scope.go:117] "RemoveContainer" containerID="81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532" Nov 29 01:35:29 crc kubenswrapper[4749]: E1129 01:35:29.379606 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532\": container with ID starting with 81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532 not found: ID does not exist" containerID="81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.379645 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532"} err="failed to get container status \"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532\": rpc error: code = NotFound desc = could not find container \"81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532\": container with ID starting with 81f673c3ca4a81eab7ec41ead9c0cfbc291003786dbc3954aa6cf5b538e79532 not found: ID does not exist" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.427311 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.620830 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.644262 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.668891 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:35:29 crc kubenswrapper[4749]: E1129 01:35:29.669423 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de2381-be42-4964-a9dd-f98253684b5e" 
containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.669443 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de2381-be42-4964-a9dd-f98253684b5e" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.669707 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9de2381-be42-4964-a9dd-f98253684b5e" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.670560 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.674838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.675417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.676087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.707841 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.745342 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.745452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.745697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.745759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.745940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5v4\" (UniqueName: \"kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.848044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5v4\" 
(UniqueName: \"kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.848143 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.848214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.848244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.848311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.856042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.856290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.857502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.858325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.871669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5v4\" (UniqueName: \"kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:29 crc kubenswrapper[4749]: I1129 01:35:29.999561 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:30 crc kubenswrapper[4749]: I1129 01:35:30.481454 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.095502 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9de2381-be42-4964-a9dd-f98253684b5e" path="/var/lib/kubelet/pods/b9de2381-be42-4964-a9dd-f98253684b5e/volumes" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.295438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"823cce34-3656-4edf-9197-5586262263ec","Type":"ContainerStarted","Data":"5efba0e169acdd879816caeca72282b7bac80064e6baf6bb9765579a9239ea9d"} Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.295806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"823cce34-3656-4edf-9197-5586262263ec","Type":"ContainerStarted","Data":"74fa2f41261ac17d863e854afffda95ac9ed649b5de4eb8782fbe1c0f96bfe53"} Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.295528 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hh4c4" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="registry-server" containerID="cri-o://dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9" gracePeriod=2 Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.328649 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.328610932 podStartE2EDuration="2.328610932s" podCreationTimestamp="2025-11-29 01:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:31.324956941 +0000 UTC m=+1474.497106798" watchObservedRunningTime="2025-11-29 01:35:31.328610932 +0000 UTC m=+1474.500760829" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.692412 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.693057 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.694438 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.695579 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.834530 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.990824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities\") pod \"83366b90-13b0-41af-8544-bbf07e21615e\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.991871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities" (OuterVolumeSpecName: "utilities") pod "83366b90-13b0-41af-8544-bbf07e21615e" (UID: "83366b90-13b0-41af-8544-bbf07e21615e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.991986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content\") pod \"83366b90-13b0-41af-8544-bbf07e21615e\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.992332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vvr\" (UniqueName: \"kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr\") pod \"83366b90-13b0-41af-8544-bbf07e21615e\" (UID: \"83366b90-13b0-41af-8544-bbf07e21615e\") " Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.993285 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:31 crc kubenswrapper[4749]: I1129 01:35:31.999114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr" (OuterVolumeSpecName: "kube-api-access-54vvr") pod "83366b90-13b0-41af-8544-bbf07e21615e" (UID: "83366b90-13b0-41af-8544-bbf07e21615e"). InnerVolumeSpecName "kube-api-access-54vvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.010855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83366b90-13b0-41af-8544-bbf07e21615e" (UID: "83366b90-13b0-41af-8544-bbf07e21615e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.095009 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83366b90-13b0-41af-8544-bbf07e21615e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.095050 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vvr\" (UniqueName: \"kubernetes.io/projected/83366b90-13b0-41af-8544-bbf07e21615e-kube-api-access-54vvr\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.316996 4749 generic.go:334] "Generic (PLEG): container finished" podID="83366b90-13b0-41af-8544-bbf07e21615e" containerID="dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9" exitCode=0 Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.322005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerDied","Data":"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9"} Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.322092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hh4c4" event={"ID":"83366b90-13b0-41af-8544-bbf07e21615e","Type":"ContainerDied","Data":"4edacf64e48ed5ab451c1c3e586f81e4327b2844a2a64314e96ddfcc14de8413"} Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.322120 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.322124 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hh4c4" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.322142 4749 scope.go:117] "RemoveContainer" containerID="dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.328138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.391497 4749 scope.go:117] "RemoveContainer" containerID="adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.409409 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.425912 4749 scope.go:117] "RemoveContainer" containerID="57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.426367 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hh4c4"] Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.520717 4749 scope.go:117] "RemoveContainer" containerID="dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9" Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.525326 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9\": container with ID starting with dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9 not found: ID does not exist" containerID="dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.525408 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9"} err="failed to get container status \"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9\": rpc error: code = NotFound desc = could not find container \"dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9\": container with ID starting with dcc79e39eb97ee0c93eca56c35600f5da94b28514a7f8e379b9755a8a7d4f7f9 not found: ID does not exist" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.525454 4749 scope.go:117] "RemoveContainer" containerID="adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3" Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.528023 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3\": container with ID starting with adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3 not found: ID does not exist" containerID="adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.528078 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3"} err="failed to get container status \"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3\": rpc error: code = NotFound desc = could not find container \"adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3\": container with ID starting with 
adeaa47facfa58f0f6dbf3d86574d1dc8f3d4add6b00186b024870a79a1448a3 not found: ID does not exist" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.528112 4749 scope.go:117] "RemoveContainer" containerID="57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9" Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.528783 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9\": container with ID starting with 57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9 not found: ID does not exist" containerID="57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.528837 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9"} err="failed to get container status \"57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9\": rpc error: code = NotFound desc = could not find container \"57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9\": container with ID starting with 57e347684998c77244ca41736980115f5b21630bbd340d9010aef6a10b4d6fe9 not found: ID does not exist" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.606490 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.606977 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="extract-utilities" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.606998 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="extract-utilities" Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.607037 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="registry-server" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.607046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="registry-server" Nov 29 01:35:32 crc kubenswrapper[4749]: E1129 01:35:32.607071 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="extract-content" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.607080 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="extract-content" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.607385 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="83366b90-13b0-41af-8544-bbf07e21615e" containerName="registry-server" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.623169 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.623314 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716270 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq9k\" (UniqueName: \"kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716292 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.716421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: 
\"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818458 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmq9k\" (UniqueName: \"kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.818476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.819791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.820319 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.820832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.821458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.821764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.839540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmq9k\" (UniqueName: \"kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k\") pod \"dnsmasq-dns-89c5cd4d5-rxq45\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 
01:35:32 crc kubenswrapper[4749]: I1129 01:35:32.949694 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:33 crc kubenswrapper[4749]: I1129 01:35:33.091725 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83366b90-13b0-41af-8544-bbf07e21615e" path="/var/lib/kubelet/pods/83366b90-13b0-41af-8544-bbf07e21615e/volumes" Nov 29 01:35:33 crc kubenswrapper[4749]: I1129 01:35:33.441974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.345768 4749 generic.go:334] "Generic (PLEG): container finished" podID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerID="9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068" exitCode=0 Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.345892 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" event={"ID":"6565857b-6329-46a9-b8c0-4cad1019c4b9","Type":"ContainerDied","Data":"9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068"} Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.346282 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" event={"ID":"6565857b-6329-46a9-b8c0-4cad1019c4b9","Type":"ContainerStarted","Data":"46d703c25d7c0e36de462abdba7a4eec4f3f242271be53b25bfc3c62c5458709"} Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.472434 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.472907 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-central-agent" containerID="cri-o://26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c" gracePeriod=30 Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.473590 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="proxy-httpd" containerID="cri-o://995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e" gracePeriod=30 Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.473652 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="sg-core" containerID="cri-o://7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d" gracePeriod=30 Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.473687 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-notification-agent" containerID="cri-o://37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f" gracePeriod=30 Nov 29 01:35:34 crc kubenswrapper[4749]: I1129 01:35:34.602458 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:34.999902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.240312 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","podabb83ebb-828e-4761-9847-e49672d30be4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podabb83ebb-828e-4761-9847-e49672d30be4] : Timed out while waiting for systemd to remove kubepods-besteffort-podabb83ebb_828e_4761_9847_e49672d30be4.slice" Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.360966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" event={"ID":"6565857b-6329-46a9-b8c0-4cad1019c4b9","Type":"ContainerStarted","Data":"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9"} Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.361794 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367122 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerID="995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e" exitCode=0 Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367144 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerID="7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d" exitCode=2 Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367153 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerID="26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c" exitCode=0 Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerDied","Data":"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e"} Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerDied","Data":"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d"} Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.367209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerDied","Data":"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c"} Nov 29 01:35:35 crc kubenswrapper[4749]: I1129 01:35:35.386235 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" podStartSLOduration=3.38621995 podStartE2EDuration="3.38621995s" podCreationTimestamp="2025-11-29 01:35:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:35.381971724 +0000 UTC m=+1478.554121621" watchObservedRunningTime="2025-11-29 01:35:35.38621995 +0000 UTC m=+1478.558369807" Nov 29 01:35:36 crc kubenswrapper[4749]: I1129 01:35:36.442673 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:36 crc kubenswrapper[4749]: I1129 01:35:36.443098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-log" containerID="cri-o://cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2" gracePeriod=30 Nov 29 01:35:36 crc kubenswrapper[4749]: I1129 
01:35:36.443242 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-api" containerID="cri-o://4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3" gracePeriod=30 Nov 29 01:35:37 crc kubenswrapper[4749]: I1129 01:35:37.392928 4749 generic.go:334] "Generic (PLEG): container finished" podID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerID="cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2" exitCode=143 Nov 29 01:35:37 crc kubenswrapper[4749]: I1129 01:35:37.393141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerDied","Data":"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2"} Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.856031 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.967286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.967351 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.967401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.967785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.968927 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmndg\" (UniqueName: \"kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.968960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.970072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.970161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.970337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml\") pod \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\" (UID: \"0ba12835-85db-4dc8-8a7c-8140157a1b2e\") " Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.970765 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.975189 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg" (OuterVolumeSpecName: "kube-api-access-fmndg") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "kube-api-access-fmndg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.976606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:38 crc kubenswrapper[4749]: I1129 01:35:38.985371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts" (OuterVolumeSpecName: "scripts") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.017451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.044782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.072256 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.072286 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.072295 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmndg\" (UniqueName: \"kubernetes.io/projected/0ba12835-85db-4dc8-8a7c-8140157a1b2e-kube-api-access-fmndg\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.072303 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ba12835-85db-4dc8-8a7c-8140157a1b2e-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.072311 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.081136 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.090890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data" (OuterVolumeSpecName: "config-data") pod "0ba12835-85db-4dc8-8a7c-8140157a1b2e" (UID: "0ba12835-85db-4dc8-8a7c-8140157a1b2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.174416 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.174447 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ba12835-85db-4dc8-8a7c-8140157a1b2e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.419611 4749 generic.go:334] "Generic (PLEG): container finished" podID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerID="37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f" exitCode=0 Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.419676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerDied","Data":"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f"} Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.419954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ba12835-85db-4dc8-8a7c-8140157a1b2e","Type":"ContainerDied","Data":"141ed00a90063637d4cfbb5a7365be00cef70c5e21f9b30dc1f49b9b4dfa5a8b"} Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.419981 4749 scope.go:117] "RemoveContainer" containerID="995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.419703 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.463398 4749 scope.go:117] "RemoveContainer" containerID="7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.512951 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.514630 4749 scope.go:117] "RemoveContainer" containerID="37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.521933 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.540748 4749 scope.go:117] "RemoveContainer" containerID="26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.555476 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.555963 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="sg-core" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.555984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="sg-core" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.556010 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-central-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556018 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" 
containerName="ceilometer-central-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.556042 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-notification-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556049 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-notification-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.556075 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="proxy-httpd" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556082 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="proxy-httpd" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556311 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-notification-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556342 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="ceilometer-central-agent" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556356 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="sg-core" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.556379 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" containerName="proxy-httpd" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.558635 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.571666 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.572063 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.572235 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.575035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.578660 4749 scope.go:117] "RemoveContainer" containerID="995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.581931 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e\": container with ID starting with 995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e not found: ID does not exist" containerID="995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.581968 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e"} err="failed to get container status \"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e\": rpc error: code = NotFound desc = could not find container \"995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e\": container with ID starting with 995272a7fa74360a9855486913c589c40ae6a5eac74a737d1736213ffdaeba7e not found: ID does not exist" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.581992 4749 scope.go:117] "RemoveContainer" containerID="7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.582338 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d\": container with ID starting with 7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d not found: ID does not exist" containerID="7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.582396 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d"} err="failed to get container status \"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d\": rpc error: code = NotFound desc = could not find container \"7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d\": container with ID starting with 7658d39bab644b6836238be170310333b814fd56b6ec3f32a7b5d6ad8c4ea21d not found: ID does not exist" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.582438 4749 scope.go:117] "RemoveContainer" containerID="37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.582743 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f\": container with ID starting with 37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f not found: ID does not exist" containerID="37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.582778 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f"} err="failed to get container status \"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f\": rpc error: code = NotFound desc = could not find container \"37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f\": container with ID starting with 37e4a97c57a30fb0caf1b976438ee34fcf0c1507080d99e150bcb892460ba88f not found: ID does not exist" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.582802 4749 scope.go:117] "RemoveContainer" containerID="26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c" Nov 29 01:35:39 crc kubenswrapper[4749]: E1129 01:35:39.583011 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c\": container with ID starting with 26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c not found: ID does not exist" containerID="26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.583043 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c"} err="failed to get container status \"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c\": rpc error: code = NotFound desc = could not find container \"26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c\": container with ID starting with 26cbbdf314687229ca8c886933b8800601635c6f78ad53328069f4ccd8b8c74c not found: ID does not exist" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.725691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.725770 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxcn4\" (UniqueName: \"kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.725846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.726009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.726059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.726103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.726174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.726327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.827890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxcn4\" (UniqueName: \"kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.830028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.831879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.834029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.834368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.834940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.838033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.847389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.856871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxcn4\" (UniqueName: 
\"kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4\") pod \"ceilometer-0\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " pod="openstack/ceilometer-0" Nov 29 01:35:39 crc kubenswrapper[4749]: I1129 01:35:39.885831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.000030 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.029399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.138090 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.336627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs\") pod \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.336739 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxn59\" (UniqueName: \"kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59\") pod \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.336785 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle\") pod \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.337048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data\") pod \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\" (UID: \"a06fa48d-bb3d-4cf7-b187-358ecee6d10d\") " Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.337885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs" (OuterVolumeSpecName: "logs") pod "a06fa48d-bb3d-4cf7-b187-358ecee6d10d" (UID: "a06fa48d-bb3d-4cf7-b187-358ecee6d10d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.345304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59" (OuterVolumeSpecName: "kube-api-access-pxn59") pod "a06fa48d-bb3d-4cf7-b187-358ecee6d10d" (UID: "a06fa48d-bb3d-4cf7-b187-358ecee6d10d"). InnerVolumeSpecName "kube-api-access-pxn59". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.371362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a06fa48d-bb3d-4cf7-b187-358ecee6d10d" (UID: "a06fa48d-bb3d-4cf7-b187-358ecee6d10d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.374885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data" (OuterVolumeSpecName: "config-data") pod "a06fa48d-bb3d-4cf7-b187-358ecee6d10d" (UID: "a06fa48d-bb3d-4cf7-b187-358ecee6d10d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.436503 4749 generic.go:334] "Generic (PLEG): container finished" podID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerID="4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3" exitCode=0 Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.436564 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.436586 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerDied","Data":"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3"} Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.438443 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a06fa48d-bb3d-4cf7-b187-358ecee6d10d","Type":"ContainerDied","Data":"c43d59695c1d49677304e22e2dd9e0a0dd91b23dcd5e7e303816567bee5acdd3"} Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.438499 4749 scope.go:117] "RemoveContainer" containerID="4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.440241 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxn59\" (UniqueName: \"kubernetes.io/projected/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-kube-api-access-pxn59\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.443285 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.443452 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.443573 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a06fa48d-bb3d-4cf7-b187-358ecee6d10d-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.462746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.462998 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.557067 4749 scope.go:117] "RemoveContainer" containerID="cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.580984 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.591457 4749 scope.go:117] "RemoveContainer" 
containerID="4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3" Nov 29 01:35:40 crc kubenswrapper[4749]: E1129 01:35:40.592016 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3\": container with ID starting with 4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3 not found: ID does not exist" containerID="4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.592103 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3"} err="failed to get container status \"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3\": rpc error: code = NotFound desc = could not find container \"4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3\": container with ID starting with 4bdb52ab9379e02bd881c2770f5d97bcf34573e8dbb702704d105c058325e2e3 not found: ID does not exist" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.592132 4749 scope.go:117] "RemoveContainer" containerID="cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2" Nov 29 01:35:40 crc kubenswrapper[4749]: E1129 01:35:40.594864 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2\": container with ID starting with cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2 not found: ID does not exist" containerID="cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.594904 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2"} err="failed to get container status \"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2\": rpc error: code = NotFound desc = could not find container \"cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2\": container with ID starting with cd28dfe0c57a3a000d92d1db7b64a93551b8f9cb6fe2121f98cf000d89d018d2 not found: ID does not exist" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.601264 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.617609 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:40 crc kubenswrapper[4749]: E1129 01:35:40.617991 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-api" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.618009 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-api" Nov 29 01:35:40 crc kubenswrapper[4749]: E1129 01:35:40.618047 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-log" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.618053 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-log" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.618280 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-api" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.618314 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" containerName="nova-api-log" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.619226 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.621929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.622817 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.622950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.627609 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qct\" (UniqueName: \"kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.646811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.714073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7pbnc"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.721409 
4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.729676 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.730300 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.734663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pbnc"] Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.750906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.750954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " 
pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qct\" (UniqueName: \"kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrc6\" (UniqueName: \"kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.751630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.756574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.759651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.763330 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.772858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.777211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qct\" (UniqueName: \"kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct\") pod \"nova-api-0\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " pod="openstack/nova-api-0" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.852698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.852762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts\") pod 
\"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.852790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.852871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrc6\" (UniqueName: \"kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.856705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.856827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.857582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.867968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrc6\" (UniqueName: \"kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6\") pod \"nova-cell1-cell-mapping-7pbnc\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:40 crc kubenswrapper[4749]: I1129 01:35:40.942668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.046108 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.095903 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba12835-85db-4dc8-8a7c-8140157a1b2e" path="/var/lib/kubelet/pods/0ba12835-85db-4dc8-8a7c-8140157a1b2e/volumes" Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.097295 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06fa48d-bb3d-4cf7-b187-358ecee6d10d" path="/var/lib/kubelet/pods/a06fa48d-bb3d-4cf7-b187-358ecee6d10d/volumes" Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.437106 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.450397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerStarted","Data":"07a474eb6ac2724e1b0cdecbb5da046a4117a4bce874254d62a90db087962686"} Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.450446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerStarted","Data":"4359ec511f961657d2664590717f11dbb2e8947c2d0bff9fea5d94067b10e986"} Nov 29 01:35:41 crc kubenswrapper[4749]: I1129 01:35:41.558571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pbnc"] Nov 29 01:35:41 crc kubenswrapper[4749]: W1129 01:35:41.561087 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a9e0c24_080d_40d1_a0b5_e796627265e7.slice/crio-21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02 WatchSource:0}: Error finding container 21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02: Status 404 returned error can't find the container with id 21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02 Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.464238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerStarted","Data":"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7"} Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.464655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerStarted","Data":"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c"} Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.464669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerStarted","Data":"e26fd635d7d852698932f04cad6fe2bdcb9af4e93beba88096de574f7478a221"} Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.466938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerStarted","Data":"5fea5dc22f78f4b44307aafccca2e0586ca8c1965943165df879437b824209c1"} Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.468771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pbnc" event={"ID":"1a9e0c24-080d-40d1-a0b5-e796627265e7","Type":"ContainerStarted","Data":"71b2e244af901dca544d387980cde284750449022d8bbea0c7e5b1dfa19f0e59"} Nov 29 01:35:42 crc 
kubenswrapper[4749]: I1129 01:35:42.468870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pbnc" event={"ID":"1a9e0c24-080d-40d1-a0b5-e796627265e7","Type":"ContainerStarted","Data":"21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02"} Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.507392 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7pbnc" podStartSLOduration=2.50737452 podStartE2EDuration="2.50737452s" podCreationTimestamp="2025-11-29 01:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:42.503091294 +0000 UTC m=+1485.675241161" watchObservedRunningTime="2025-11-29 01:35:42.50737452 +0000 UTC m=+1485.679524387" Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.511140 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.511131853 podStartE2EDuration="2.511131853s" podCreationTimestamp="2025-11-29 01:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:42.483545049 +0000 UTC m=+1485.655694906" watchObservedRunningTime="2025-11-29 01:35:42.511131853 +0000 UTC m=+1485.683281720" Nov 29 01:35:42 crc kubenswrapper[4749]: I1129 01:35:42.951379 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.015910 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.016187 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="dnsmasq-dns" containerID="cri-o://fe391866b222bdc27b02c092a42376ca48bf580b3de4ed28527aad1c47cfa12b" gracePeriod=10 Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.485095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerStarted","Data":"9eed71d2ea4b0dda3706b093b511a6005492203beb6c1a0d7d09b12729581237"} Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.487600 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerID="fe391866b222bdc27b02c092a42376ca48bf580b3de4ed28527aad1c47cfa12b" exitCode=0 Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.488463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" event={"ID":"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d","Type":"ContainerDied","Data":"fe391866b222bdc27b02c092a42376ca48bf580b3de4ed28527aad1c47cfa12b"} Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.488491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" event={"ID":"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d","Type":"ContainerDied","Data":"9475d37a13b715892df79770e9c1221c607faed7b71b2dc0c09f8bf0891f0f85"} Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.488502 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9475d37a13b715892df79770e9c1221c607faed7b71b2dc0c09f8bf0891f0f85" Nov 29 01:35:43 crc 
kubenswrapper[4749]: I1129 01:35:43.558470 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.654897 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.654945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.654979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.655048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58t2t\" (UniqueName: \"kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.655142 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.655234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb\") pod \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\" (UID: \"6a4ce63a-fec4-4ffa-8bd3-b68343676e4d\") " Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.675587 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t" (OuterVolumeSpecName: "kube-api-access-58t2t") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "kube-api-access-58t2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.706662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.707634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config" (OuterVolumeSpecName: "config") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.710596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.715662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.730590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" (UID: "6a4ce63a-fec4-4ffa-8bd3-b68343676e4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.756935 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58t2t\" (UniqueName: \"kubernetes.io/projected/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-kube-api-access-58t2t\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.756969 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.756980 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.756988 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.756996 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:43 crc kubenswrapper[4749]: I1129 01:35:43.757005 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.499702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerStarted","Data":"b6161f4f801200f63a69f8699a09c9b84f3a1ded5a4c508dc41de08731035fec"} Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.500067 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.499737 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.522925 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.851026519 podStartE2EDuration="5.52290713s" podCreationTimestamp="2025-11-29 01:35:39 +0000 UTC" firstStartedPulling="2025-11-29 01:35:40.47043989 +0000 UTC m=+1483.642589747" lastFinishedPulling="2025-11-29 01:35:44.142320501 +0000 UTC m=+1487.314470358" observedRunningTime="2025-11-29 01:35:44.516104072 +0000 UTC m=+1487.688253929" watchObservedRunningTime="2025-11-29 01:35:44.52290713 +0000 UTC m=+1487.695056987" Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.556899 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:35:44 crc kubenswrapper[4749]: I1129 01:35:44.565816 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-g5bnn"] Nov 29 01:35:45 crc kubenswrapper[4749]: I1129 01:35:45.088405 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" path="/var/lib/kubelet/pods/6a4ce63a-fec4-4ffa-8bd3-b68343676e4d/volumes" Nov 29 01:35:47 crc kubenswrapper[4749]: I1129 01:35:47.549052 4749 generic.go:334] "Generic (PLEG): container finished" podID="1a9e0c24-080d-40d1-a0b5-e796627265e7" containerID="71b2e244af901dca544d387980cde284750449022d8bbea0c7e5b1dfa19f0e59" exitCode=0 Nov 29 01:35:47 crc kubenswrapper[4749]: I1129 01:35:47.549658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pbnc" event={"ID":"1a9e0c24-080d-40d1-a0b5-e796627265e7","Type":"ContainerDied","Data":"71b2e244af901dca544d387980cde284750449022d8bbea0c7e5b1dfa19f0e59"} Nov 29 01:35:48 crc kubenswrapper[4749]: I1129 01:35:48.406509 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-g5bnn" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.185:5353: i/o timeout" Nov 29 01:35:48 crc kubenswrapper[4749]: I1129 01:35:48.917960 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.097784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle\") pod \"1a9e0c24-080d-40d1-a0b5-e796627265e7\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.097951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts\") pod \"1a9e0c24-080d-40d1-a0b5-e796627265e7\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.098036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrc6\" (UniqueName: \"kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6\") pod \"1a9e0c24-080d-40d1-a0b5-e796627265e7\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.098309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data\") pod \"1a9e0c24-080d-40d1-a0b5-e796627265e7\" (UID: \"1a9e0c24-080d-40d1-a0b5-e796627265e7\") " Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.104759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6" (OuterVolumeSpecName: "kube-api-access-qgrc6") pod "1a9e0c24-080d-40d1-a0b5-e796627265e7" (UID: "1a9e0c24-080d-40d1-a0b5-e796627265e7"). InnerVolumeSpecName "kube-api-access-qgrc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.106650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts" (OuterVolumeSpecName: "scripts") pod "1a9e0c24-080d-40d1-a0b5-e796627265e7" (UID: "1a9e0c24-080d-40d1-a0b5-e796627265e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.150862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a9e0c24-080d-40d1-a0b5-e796627265e7" (UID: "1a9e0c24-080d-40d1-a0b5-e796627265e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.160722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data" (OuterVolumeSpecName: "config-data") pod "1a9e0c24-080d-40d1-a0b5-e796627265e7" (UID: "1a9e0c24-080d-40d1-a0b5-e796627265e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.200722 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.200766 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgrc6\" (UniqueName: \"kubernetes.io/projected/1a9e0c24-080d-40d1-a0b5-e796627265e7-kube-api-access-qgrc6\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.200781 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.200797 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9e0c24-080d-40d1-a0b5-e796627265e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.572461 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7pbnc" event={"ID":"1a9e0c24-080d-40d1-a0b5-e796627265e7","Type":"ContainerDied","Data":"21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02"} Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.572866 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e5a067fc53fb9a29c0a92926d0a3f33f19a5ef7b49f2001c15eecd05e2cf02" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.572591 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7pbnc" Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.764294 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.764574 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-log" containerID="cri-o://8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c" gracePeriod=30 Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.764626 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-api" containerID="cri-o://9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7" gracePeriod=30 Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.779254 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.779515 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b7e86f35-22ab-440f-b871-7659ae70bd7c" containerName="nova-scheduler-scheduler" containerID="cri-o://5809fc32e7b9b0080793c6ddca2507282d32e019f797e6a3a4623fbc1225dcd3" gracePeriod=30 Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.846443 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.846729 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" 
containerName="nova-metadata-log" containerID="cri-o://bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058" gracePeriod=30 Nov 29 01:35:49 crc kubenswrapper[4749]: I1129 01:35:49.847017 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata" containerID="cri-o://016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1" gracePeriod=30 Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.355168 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9qct\" (UniqueName: \"kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data\") pod \"411a19c5-3289-4637-8d38-1d758496105d\" (UID: \"411a19c5-3289-4637-8d38-1d758496105d\") " Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.524808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs" (OuterVolumeSpecName: "logs") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.525298 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a19c5-3289-4637-8d38-1d758496105d-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.546575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct" (OuterVolumeSpecName: "kube-api-access-l9qct") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "kube-api-access-l9qct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.568922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.576182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data" (OuterVolumeSpecName: "config-data") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.584922 4749 generic.go:334] "Generic (PLEG): container finished" podID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerID="bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058" exitCode=143 Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.585005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerDied","Data":"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058"} Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.587142 4749 generic.go:334] "Generic (PLEG): container finished" podID="b7e86f35-22ab-440f-b871-7659ae70bd7c" containerID="5809fc32e7b9b0080793c6ddca2507282d32e019f797e6a3a4623fbc1225dcd3" exitCode=0 Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.587217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7e86f35-22ab-440f-b871-7659ae70bd7c","Type":"ContainerDied","Data":"5809fc32e7b9b0080793c6ddca2507282d32e019f797e6a3a4623fbc1225dcd3"} Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588832 4749 generic.go:334] "Generic (PLEG): container finished" podID="411a19c5-3289-4637-8d38-1d758496105d" containerID="9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7" exitCode=0 Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588856 4749 generic.go:334] "Generic (PLEG): container finished" podID="411a19c5-3289-4637-8d38-1d758496105d" containerID="8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c" exitCode=143 Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerDied","Data":"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7"} Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerDied","Data":"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c"} Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"411a19c5-3289-4637-8d38-1d758496105d","Type":"ContainerDied","Data":"e26fd635d7d852698932f04cad6fe2bdcb9af4e93beba88096de574f7478a221"} Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.588917 4749 scope.go:117] "RemoveContainer" containerID="9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.589058 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.595024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.596619 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "411a19c5-3289-4637-8d38-1d758496105d" (UID: "411a19c5-3289-4637-8d38-1d758496105d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:50 crc kubenswrapper[4749]: I1129 01:35:50.625858 4749 scope.go:117] "RemoveContainer" containerID="8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.628915 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9qct\" (UniqueName: \"kubernetes.io/projected/411a19c5-3289-4637-8d38-1d758496105d-kube-api-access-l9qct\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.628938 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.628948 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.628957 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.628965 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a19c5-3289-4637-8d38-1d758496105d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.652027 4749 scope.go:117] "RemoveContainer" containerID="9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:50.652502 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7\": container with ID starting with 9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7 not found: ID does not exist" containerID="9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.652531 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7"} err="failed to get container status \"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7\": rpc error: code = NotFound desc = could not find container \"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7\": container with ID starting with 9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7 not found: ID does not exist" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.652562 4749 scope.go:117] "RemoveContainer" containerID="8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:50.653011 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c\": container with ID starting with 8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c not found: ID does not exist" containerID="8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.653035 4749 
Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.653052 4749 scope.go:117] "RemoveContainer" containerID="9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7"
Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.653315 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7"} err="failed to get container status \"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7\": rpc error: code = NotFound desc = could not find container \"9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7\": container with ID starting with 9691a259ba076531e5085c7f76b50b5470cc46a4fcc1188eea0ae5ac4226bec7 not found: ID does not exist"
Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.653333 4749 scope.go:117] "RemoveContainer" containerID="8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c"
Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.654515 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c"} err="failed to get container status \"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c\": rpc error: code = NotFound desc = could not find container \"8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c\": container with ID starting with 8fef964c499c99576750db4ae592d83aefc9398a42c67994aaffb1de93522e5c not found: ID does not exist"
Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.764154 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.933484 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle\") pod \"b7e86f35-22ab-440f-b871-7659ae70bd7c\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.933745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2h5t\" (UniqueName: \"kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t\") pod \"b7e86f35-22ab-440f-b871-7659ae70bd7c\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.933896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data\") pod \"b7e86f35-22ab-440f-b871-7659ae70bd7c\" (UID: \"b7e86f35-22ab-440f-b871-7659ae70bd7c\") " Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.943899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t" (OuterVolumeSpecName: "kube-api-access-b2h5t") pod "b7e86f35-22ab-440f-b871-7659ae70bd7c" (UID: "b7e86f35-22ab-440f-b871-7659ae70bd7c"). InnerVolumeSpecName "kube-api-access-b2h5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.960144 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.970756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data" (OuterVolumeSpecName: "config-data") pod "b7e86f35-22ab-440f-b871-7659ae70bd7c" (UID: "b7e86f35-22ab-440f-b871-7659ae70bd7c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.985396 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:50.988901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7e86f35-22ab-440f-b871-7659ae70bd7c" (UID: "b7e86f35-22ab-440f-b871-7659ae70bd7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.021509 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022382 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-api" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022418 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-api" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022432 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e86f35-22ab-440f-b871-7659ae70bd7c" containerName="nova-scheduler-scheduler" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022438 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e86f35-22ab-440f-b871-7659ae70bd7c" containerName="nova-scheduler-scheduler" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022463 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9e0c24-080d-40d1-a0b5-e796627265e7" containerName="nova-manage" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022469 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9e0c24-080d-40d1-a0b5-e796627265e7" containerName="nova-manage" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="init" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022486 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="init" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022508 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-log" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022513 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-log" Nov 29 01:35:51 crc kubenswrapper[4749]: E1129 01:35:51.022545 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="dnsmasq-dns" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022552 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="dnsmasq-dns" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022871 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e86f35-22ab-440f-b871-7659ae70bd7c" containerName="nova-scheduler-scheduler" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022892 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-api" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022919 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="411a19c5-3289-4637-8d38-1d758496105d" containerName="nova-api-log" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022938 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4ce63a-fec4-4ffa-8bd3-b68343676e4d" containerName="dnsmasq-dns" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.022949 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a9e0c24-080d-40d1-a0b5-e796627265e7" 
containerName="nova-manage" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.025023 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.028067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.028311 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.028365 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.032692 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.036124 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2h5t\" (UniqueName: \"kubernetes.io/projected/b7e86f35-22ab-440f-b871-7659ae70bd7c-kube-api-access-b2h5t\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.036154 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.036164 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7e86f35-22ab-440f-b871-7659ae70bd7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.088403 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411a19c5-3289-4637-8d38-1d758496105d" path="/var/lib/kubelet/pods/411a19c5-3289-4637-8d38-1d758496105d/volumes" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137715 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwbjr\" (UniqueName: \"kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.137888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239382 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwbjr\" (UniqueName: \"kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.239762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.240601 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.244999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.245041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.248538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.249699 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.258752 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwbjr\" (UniqueName: \"kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr\") pod \"nova-api-0\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") " pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.362669 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.620470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7e86f35-22ab-440f-b871-7659ae70bd7c","Type":"ContainerDied","Data":"3fb133ea780a9c82587fd6c6dc27ae3e88b5f5095af4e9c6d884784e00d01c5a"} Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.620626 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.620839 4749 scope.go:117] "RemoveContainer" containerID="5809fc32e7b9b0080793c6ddca2507282d32e019f797e6a3a4623fbc1225dcd3" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.668484 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.679571 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.690681 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.691994 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.701501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.706132 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.852025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.852427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.852514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqlvg\" (UniqueName: \"kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.882671 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:35:51 crc kubenswrapper[4749]: W1129 01:35:51.890385 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf91885d5_497a_41e4_9796_ca25f184b178.slice/crio-f2b443d970a95528d88d248d726b79e84330d35a18bee21c818cbecd0765b0bf WatchSource:0}: Error finding container f2b443d970a95528d88d248d726b79e84330d35a18bee21c818cbecd0765b0bf: Status 404 returned error can't find the container with id f2b443d970a95528d88d248d726b79e84330d35a18bee21c818cbecd0765b0bf Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.954383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.954752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.954788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqlvg\" (UniqueName: \"kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.959955 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.960130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:51 crc kubenswrapper[4749]: I1129 01:35:51.972252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqlvg\" (UniqueName: \"kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg\") pod \"nova-scheduler-0\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " pod="openstack/nova-scheduler-0" Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.022794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:35:52 crc kubenswrapper[4749]: W1129 01:35:52.491968 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5792ec0_0d00_47e7_8d9d_d3133cd1e695.slice/crio-a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283 WatchSource:0}: Error finding container a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283: Status 404 returned error can't find the container with id a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283 Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.500048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.643061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5792ec0-0d00-47e7-8d9d-d3133cd1e695","Type":"ContainerStarted","Data":"a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283"} Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.645324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerStarted","Data":"42f0d632bd584d2a9e5b8e0e0d9b68c2c2cb9b1edeb4d0f8370b7e18d29880ce"} Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.645375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerStarted","Data":"2156bc219030d070a0c4da6cd4ca235e2905d2b767d2af6d322ac73d528b20d1"} Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.645392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerStarted","Data":"f2b443d970a95528d88d248d726b79e84330d35a18bee21c818cbecd0765b0bf"} Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.676032 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.676014016 podStartE2EDuration="2.676014016s" podCreationTimestamp="2025-11-29 01:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:52.664668174 +0000 UTC m=+1495.836818041" watchObservedRunningTime="2025-11-29 01:35:52.676014016 +0000 UTC m=+1495.848163873" Nov 29 01:35:52 crc 
Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.989158 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:46854->10.217.0.189:8775: read: connection reset by peer"
Nov 29 01:35:52 crc kubenswrapper[4749]: I1129 01:35:52.989294 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:46864->10.217.0.189:8775: read: connection reset by peer"
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.094837 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e86f35-22ab-440f-b871-7659ae70bd7c" path="/var/lib/kubelet/pods/b7e86f35-22ab-440f-b871-7659ae70bd7c/volumes"
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.465185 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.590567 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs\") pod \"7013e94a-6bc7-4cda-9577-117ca35a6024\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") "
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.590769 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs\") pod \"7013e94a-6bc7-4cda-9577-117ca35a6024\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") "
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.590818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle\") pod \"7013e94a-6bc7-4cda-9577-117ca35a6024\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") "
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.590916 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data\") pod \"7013e94a-6bc7-4cda-9577-117ca35a6024\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") "
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.590963 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8dm7\" (UniqueName: \"kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7\") pod \"7013e94a-6bc7-4cda-9577-117ca35a6024\" (UID: \"7013e94a-6bc7-4cda-9577-117ca35a6024\") "
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.591786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs" (OuterVolumeSpecName: "logs") pod "7013e94a-6bc7-4cda-9577-117ca35a6024" (UID: "7013e94a-6bc7-4cda-9577-117ca35a6024"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
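
Note: the readiness failures above are plain HTTPS GETs against the pod IP that died with a TCP reset. A minimal stand-in for such a probe, treating any transport error as a failure with the error text as output (URL, timeout, and TLS handling here are illustrative, not the kubelet prober's code):

    import socket
    import ssl
    import urllib.request

    def http_probe(url: str, timeout: float = 1.0) -> tuple[bool, str]:
        """Toy HTTP(S) check: 2xx/3xx is success; any transport error
        (reset, refused, timeout) or HTTP error is a failure with output."""
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE   # probes here hit raw pod IPs, not CA-valid names
        try:
            with urllib.request.urlopen(url, timeout=timeout, context=ctx) as resp:
                return 200 <= resp.status < 400, f"HTTP {resp.status}"
        except (OSError, socket.timeout) as exc:
            return False, f'Get "{url}": {exc}'

    ok, output = http_probe("https://10.217.0.189:8775/")   # pod IP from the log
    print(ok, output)
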
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.604549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7" (OuterVolumeSpecName: "kube-api-access-b8dm7") pod "7013e94a-6bc7-4cda-9577-117ca35a6024" (UID: "7013e94a-6bc7-4cda-9577-117ca35a6024"). InnerVolumeSpecName "kube-api-access-b8dm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.639180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7013e94a-6bc7-4cda-9577-117ca35a6024" (UID: "7013e94a-6bc7-4cda-9577-117ca35a6024"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.655545 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data" (OuterVolumeSpecName: "config-data") pod "7013e94a-6bc7-4cda-9577-117ca35a6024" (UID: "7013e94a-6bc7-4cda-9577-117ca35a6024"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.659563 4749 generic.go:334] "Generic (PLEG): container finished" podID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerID="016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1" exitCode=0 Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.659615 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.659637 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerDied","Data":"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1"} Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.659671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7013e94a-6bc7-4cda-9577-117ca35a6024","Type":"ContainerDied","Data":"edcaf3ebfe9a4f6b7a96929edb8cc41adddd2fb684c564b11c3de65424ed0a56"} Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.659692 4749 scope.go:117] "RemoveContainer" containerID="016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.663785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5792ec0-0d00-47e7-8d9d-d3133cd1e695","Type":"ContainerStarted","Data":"7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910"} Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.675548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7013e94a-6bc7-4cda-9577-117ca35a6024" (UID: "7013e94a-6bc7-4cda-9577-117ca35a6024"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.686426 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.686408735 podStartE2EDuration="2.686408735s" podCreationTimestamp="2025-11-29 01:35:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:53.684925788 +0000 UTC m=+1496.857075655" watchObservedRunningTime="2025-11-29 01:35:53.686408735 +0000 UTC m=+1496.858558592" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.686530 4749 scope.go:117] "RemoveContainer" containerID="bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.693466 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.693513 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8dm7\" (UniqueName: \"kubernetes.io/projected/7013e94a-6bc7-4cda-9577-117ca35a6024-kube-api-access-b8dm7\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.693524 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.693534 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7013e94a-6bc7-4cda-9577-117ca35a6024-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.693545 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7013e94a-6bc7-4cda-9577-117ca35a6024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.710990 4749 scope.go:117] "RemoveContainer" containerID="016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1" Nov 29 01:35:53 crc kubenswrapper[4749]: E1129 01:35:53.711538 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1\": container with ID starting with 016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1 not found: ID does not exist" containerID="016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.711585 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1"} err="failed to get container status \"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1\": rpc error: code = NotFound desc = could not find container \"016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1\": container with ID starting with 016cc899ef2e501f9392f1c6c82a3ce5a35f49518090a4a4e8741a10a07f5ef1 not found: ID does not exist" Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.711624 4749 scope.go:117] "RemoveContainer" containerID="bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058" Nov 
Nov 29 01:35:53 crc kubenswrapper[4749]: E1129 01:35:53.711978 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058\": container with ID starting with bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058 not found: ID does not exist" containerID="bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058"
Nov 29 01:35:53 crc kubenswrapper[4749]: I1129 01:35:53.712033 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058"} err="failed to get container status \"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058\": rpc error: code = NotFound desc = could not find container \"bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058\": container with ID starting with bf361db3f3cb4f34a8c897d4f3052d2d017dcfde642db6bcef2f148d9f184058 not found: ID does not exist"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.077107 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.091580 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.106966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 01:35:54 crc kubenswrapper[4749]: E1129 01:35:54.107727 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.107767 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata"
Nov 29 01:35:54 crc kubenswrapper[4749]: E1129 01:35:54.107822 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-log"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.107835 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-log"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.108171 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-metadata"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.108235 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" containerName="nova-metadata-log"
Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.109986 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.113138 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.114363 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.128651 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.208442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.208490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.208544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.208771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.208908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.310576 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.310712 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.310805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " 
pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.310877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.310989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.311534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.320061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.321419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.334019 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.335833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") pod \"nova-metadata-0\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") " pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.430533 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:35:54 crc kubenswrapper[4749]: I1129 01:35:54.973597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.089038 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7013e94a-6bc7-4cda-9577-117ca35a6024" path="/var/lib/kubelet/pods/7013e94a-6bc7-4cda-9577-117ca35a6024/volumes" Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.374027 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.374127 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.705822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerStarted","Data":"5de820b5dca490141173e58c8807e85dfc695c9b0141c1364e49c90d3e7f477e"} Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.706135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerStarted","Data":"8aabc98b71671061ad9285c52d012f1555e22985b6de832773cf911ed650a0eb"} Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.706153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerStarted","Data":"44de507991c18908d1c9bc05646c35ae0db12206dec2019de193655acb26ec96"} Nov 29 01:35:55 crc kubenswrapper[4749]: I1129 01:35:55.741557 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.741539828 podStartE2EDuration="1.741539828s" podCreationTimestamp="2025-11-29 01:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:35:55.731698283 +0000 UTC m=+1498.903848160" watchObservedRunningTime="2025-11-29 01:35:55.741539828 +0000 UTC m=+1498.913689685" Nov 29 01:35:57 crc kubenswrapper[4749]: I1129 01:35:57.023751 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 01:35:59 crc kubenswrapper[4749]: I1129 01:35:59.431324 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 01:35:59 crc kubenswrapper[4749]: I1129 01:35:59.431996 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 01:36:01 crc kubenswrapper[4749]: I1129 01:36:01.363768 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:36:01 crc kubenswrapper[4749]: I1129 01:36:01.363916 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 
Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 01:36:02.024052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 01:36:02.071874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 01:36:02.384594 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 01:36:02.384664 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 29 01:36:02 crc kubenswrapper[4749]: I1129 01:36:02.855954 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 29 01:36:04 crc kubenswrapper[4749]: I1129 01:36:04.431478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 29 01:36:04 crc kubenswrapper[4749]: I1129 01:36:04.431862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 29 01:36:05 crc kubenswrapper[4749]: I1129 01:36:05.451538 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 29 01:36:05 crc kubenswrapper[4749]: I1129 01:36:05.452038 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 29 01:36:09 crc kubenswrapper[4749]: I1129 01:36:09.899223 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.374490 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.375603 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.378820 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.390336 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.901525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 29 01:36:11 crc kubenswrapper[4749]: I1129 01:36:11.912051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 29 01:36:14 crc kubenswrapper[4749]: I1129 01:36:14.439355 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
status="started" pod="openstack/nova-metadata-0" Nov 29 01:36:14 crc kubenswrapper[4749]: I1129 01:36:14.440519 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 01:36:14 crc kubenswrapper[4749]: I1129 01:36:14.447936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 01:36:14 crc kubenswrapper[4749]: I1129 01:36:14.946077 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 01:36:25 crc kubenswrapper[4749]: I1129 01:36:25.374178 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:36:25 crc kubenswrapper[4749]: I1129 01:36:25.374781 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:36:25 crc kubenswrapper[4749]: I1129 01:36:25.374827 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:36:25 crc kubenswrapper[4749]: I1129 01:36:25.375620 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:36:25 crc kubenswrapper[4749]: I1129 01:36:25.375672 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" gracePeriod=600 Nov 29 01:36:25 crc kubenswrapper[4749]: E1129 01:36:25.498563 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:36:26 crc kubenswrapper[4749]: I1129 01:36:26.071740 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" exitCode=0 Nov 29 01:36:26 crc kubenswrapper[4749]: I1129 01:36:26.071810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"} Nov 29 01:36:26 crc kubenswrapper[4749]: I1129 01:36:26.071850 4749 scope.go:117] "RemoveContainer" 
containerID="78a46fc2167fe8c7a63102b9ca82268fb546ed6ba88ac6008b9b767e900c3b97" Nov 29 01:36:26 crc kubenswrapper[4749]: I1129 01:36:26.072875 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:36:26 crc kubenswrapper[4749]: E1129 01:36:26.073577 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:36:34 crc kubenswrapper[4749]: I1129 01:36:34.971261 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 29 01:36:34 crc kubenswrapper[4749]: I1129 01:36:34.972104 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41" containerName="openstackclient" containerID="cri-o://b48098c5e4956d09db8ec03faedfc1ef2c22fcc5a22369e6385279648a81cfce" gracePeriod=2 Nov 29 01:36:34 crc kubenswrapper[4749]: I1129 01:36:34.988402 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.018869 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance261d-account-delete-pndlc"] Nov 29 01:36:35 crc kubenswrapper[4749]: E1129 01:36:35.019326 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41" containerName="openstackclient" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.019340 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41" containerName="openstackclient" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.019559 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41" containerName="openstackclient" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.020225 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.077626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance261d-account-delete-pndlc"] Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.112846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89lml\" (UniqueName: \"kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.112933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.187149 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mz2m2"] Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.232262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89lml\" (UniqueName: \"kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.248746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.255687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.259264 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mz2m2"] Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.319802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89lml\" (UniqueName: \"kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml\") pod \"glance261d-account-delete-pndlc\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.387634 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.387867 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" containerID="cri-o://32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" gracePeriod=30 Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.388289 
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.393294 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance261d-account-delete-pndlc"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.421501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.431972 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cindere578-account-delete-x9bh8"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.433527 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.452646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindere578-account-delete-x9bh8"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.479347 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.479990 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="openstack-network-exporter" containerID="cri-o://d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2" gracePeriod=300
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.495932 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.504924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.547105 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.569018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.569064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54xx2\" (UniqueName: \"kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: E1129 01:36:35.569377 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Nov 29 01:36:35 crc kubenswrapper[4749]: E1129 01:36:35.569423 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data podName:9a9603fe-72d8-479a-86be-9b914455fba1 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:36.069405877 +0000 UTC m=+1539.241555724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data") pod "rabbitmq-cell1-server-0" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1") : configmap "rabbitmq-cell1-config-data" not found
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.645311 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.646676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.653450 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="ovsdbserver-nb" containerID="cri-o://3dc57790e819b5c2fd7d24841a617fcfe2253270800c29923a0f8a3796f1a9a5" gracePeriod=300
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.668405 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.670397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.670499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.670520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54xx2\" (UniqueName: \"kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.670619 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx74p\" (UniqueName: \"kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.674861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.685236 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rj68d"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.730182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54xx2\" (UniqueName: \"kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2\") pod \"cindere578-account-delete-x9bh8\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.768955 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rj68d"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.772794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx74p\" (UniqueName: \"kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.772832 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.772862 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.772893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8mf\" (UniqueName: \"kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.779887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.814335 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cindere578-account-delete-x9bh8"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.834376 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.835608 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5123-account-delete-qmfnv"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.841304 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.863488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx74p\" (UniqueName: \"kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p\") pod \"barbican7ea4-account-delete-hg54s\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " pod="openstack/barbican7ea4-account-delete-hg54s"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.877574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x6f5q"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.885889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.886099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8mf\" (UniqueName: \"kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.887025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.891700 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x6f5q"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.952990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8mf\" (UniqueName: \"kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf\") pod \"placement7db1-account-delete-x9f9k\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " pod="openstack/placement7db1-account-delete-x9f9k"
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.953258 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jzclg"]
Nov 29 01:36:35 crc kubenswrapper[4749]: I1129 01:36:35.998939 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jzclg"]
Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:35.999661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q26h\" (UniqueName: \"kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv"
Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:35.999812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv"
\"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.040314 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.041116 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-httpd" containerID="cri-o://81bb8b04fc234401df04769295fca821df1009b777f50ebd985ac7c880a1e11d" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.041012 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-log" containerID="cri-o://f72654b1f6476b877d4a3b3f9bd36b7ac0424fa98f947f3608445049745b170c" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.064228 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.064752 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-server" containerID="cri-o://019a47cff718db2d0002a6650c9afb4a966ae40b1aef5de5b8fac16e7100d973" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065128 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="swift-recon-cron" containerID="cri-o://894dac0e55c5b8303d062dc9a73ff359863207502ac50a567d5a516b5044255f" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065228 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="rsync" containerID="cri-o://efa94720f5ca79ab7d9121540b501d8e5d9310c95e8baf7e219e6f4cee72dadd" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065269 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-expirer" containerID="cri-o://280811dcf44a523f325934a8c2570ea8ae0d344113018439d508fa6efc5324ec" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065309 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-updater" containerID="cri-o://a78d3a0a66909398264af9304c9055f1ddb5c500bf5846177971f56a8ebc2113" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065343 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-auditor" containerID="cri-o://e9c55bc5cd269128e765337cd53c4eb2aa665fc0070c9dbb2cddb9feb42c9d56" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065389 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-replicator" containerID="cri-o://2153b28d7b1ae703a650e64e126179253f7846999b9a6400cfd009a599bbb246" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065421 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-server" containerID="cri-o://bfafc82b272a3145020f82bc16f80a2708db4958f657da2553e53da41914e8a6" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065448 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-updater" containerID="cri-o://9f220385e1bef364c02962b269f0e2db7e6230aaf50d37f52f736cd72a640af1" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065479 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-auditor" containerID="cri-o://c135e668a9588d4d539c1f9ecce850efa84d0c9cf2ace29a444e0f2d6c4d4e1d" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065513 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-replicator" containerID="cri-o://5cd4b93f19f43d8c22124c663685c2a5b7c3218c0e81bb3c102046e182dec0d8" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065548 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-server" containerID="cri-o://e391c22d2e560a1daaa903407899dba8b65a96672ec1d83469d63ccacf47014a" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065587 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-reaper" containerID="cri-o://c6544b18ba1c97f83dddee9f06c12698f0b180173fc59826441ce3ebe0d76ccb" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065612 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-auditor" containerID="cri-o://f9bfeddd31df51ff31ba3def5c4f3f2e8ba2fc1efc9f023e76e32fba61e40263" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.065643 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-replicator" containerID="cri-o://e28b38f1e7521875f74819f009a7406efa1e468522d52a3aae45ded543cc2908" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.113616 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-s97ck"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.139661 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican7ea4-account-delete-hg54s" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.141098 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-s97ck"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.185786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q26h\" (UniqueName: \"kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.185950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.188252 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.188305 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data podName:9a9603fe-72d8-479a-86be-9b914455fba1 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:37.188293267 +0000 UTC m=+1540.360443114 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data") pod "rabbitmq-cell1-server-0" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1") : configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.189327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.272674 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement7db1-account-delete-x9f9k" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.300598 4749 generic.go:334] "Generic (PLEG): container finished" podID="c639d859-841e-4f38-a2b3-09fc3201e616" containerID="54a83438d867dacb43afd491866da3b4e42fc03fc4d37a58d1393a675b29c458" exitCode=2 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.300682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerDied","Data":"54a83438d867dacb43afd491866da3b4e42fc03fc4d37a58d1393a675b29c458"} Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.305334 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.305593 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="dnsmasq-dns" containerID="cri-o://f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9" gracePeriod=10 Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.485507 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.490873 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d619b935-7717-4e88-af76-97e946d3cef5/ovsdbserver-nb/0.log" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.490917 4749 generic.go:334] "Generic (PLEG): container finished" podID="d619b935-7717-4e88-af76-97e946d3cef5" containerID="d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2" exitCode=2 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.490934 4749 generic.go:334] "Generic (PLEG): container finished" podID="d619b935-7717-4e88-af76-97e946d3cef5" containerID="3dc57790e819b5c2fd7d24841a617fcfe2253270800c29923a0f8a3796f1a9a5" exitCode=143 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.490984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerDied","Data":"d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2"} Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.491010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerDied","Data":"3dc57790e819b5c2fd7d24841a617fcfe2253270800c29923a0f8a3796f1a9a5"} Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.549758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q26h\" (UniqueName: \"kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h\") pod \"neutron5123-account-delete-qmfnv\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") " pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.591172 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.591640 4749 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-log" containerID="cri-o://81a360b2af0a80524dd327e6ff413357d3ae93289f88811a965e85a9ada36073" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.592053 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-httpd" containerID="cri-o://60d404c089766276e2b776b298e5f3050db4dd04153458e59685a427fe248694" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.648401 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.714854 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.715129 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.730436 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.730653 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6zwjh" podUID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" containerName="openstack-network-exporter" containerID="cri-o://23d0cb96cf04c81af3bc6f2003ca35fb96d25512a2dad19539c520ac601aba5c" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.809707 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1089c-account-delete-d5r26"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.811567 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.829423 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qtf89"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.834726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.834784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.850769 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qtf89"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.870775 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.920383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1089c-account-delete-d5r26"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.942418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.942586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.942521 4749 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.943237 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:37.44318392 +0000 UTC m=+1540.615333777 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : configmap "openstack-cell1-scripts" not found Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.948298 4749 projected.go:194] Error preparing data for projected volume kube-api-access-v46xk for pod openstack/novacell1089c-account-delete-d5r26: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.948406 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:37.448390449 +0000 UTC m=+1540.620540296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-v46xk" (UniqueName: "kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.951890 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:36:36 crc kubenswrapper[4749]: E1129 01:36:36.952169 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd619b935_7717_4e88_af76_97e946d3cef5.slice/crio-conmon-d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd619b935_7717_4e88_af76_97e946d3cef5.slice/crio-d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc639d859_841e_4f38_a2b3_09fc3201e616.slice/crio-54a83438d867dacb43afd491866da3b4e42fc03fc4d37a58d1393a675b29c458.scope\": RecentStats: unable to find data in memory cache]" Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.969412 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.969642 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="cinder-scheduler" containerID="cri-o://f743ba6ff8b1da63b560049f4030c6a63de306ec90e8671821fcc72e7725c52a" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.971253 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="probe" containerID="cri-o://5ed421f49bc336804d73a5ee8923bd091bdb89ba7051efb3930f5c62c2dfef03" gracePeriod=30 Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.976612 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:36:36 crc kubenswrapper[4749]: I1129 01:36:36.981033 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" 
podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="openstack-network-exporter" containerID="cri-o://6e9657e12a54da3db09ff22f9ac8c9eb8a67c2424dfe7eae2c12bb076d0ac942" gracePeriod=300 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.009655 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.011821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.027138 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.031721 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell08947-account-delete-x2bgc" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.044226 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zh56\" (UniqueName: \"kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.044308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.049245 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.049489 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api-log" containerID="cri-o://eced19490e815516b429f396da09f8236d99a0db8fbad5ff60fe95e770b29786" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.049614 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api" containerID="cri-o://0957fdee810eb3eb8c31f26782750e24fed0ee07cdb3eb0ee23af55d2e009a5c" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.066272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.141893 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab7557b-b040-450a-b0ad-437720fab3a2" path="/var/lib/kubelet/pods/2ab7557b-b040-450a-b0ad-437720fab3a2/volumes" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.142648 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655fb5be-d1d0-4e82-bc40-f76dd4ddb133" path="/var/lib/kubelet/pods/655fb5be-d1d0-4e82-bc40-f76dd4ddb133/volumes" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.143216 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c154a8-70f6-41c6-9040-bfaea3b6caf1" path="/var/lib/kubelet/pods/66c154a8-70f6-41c6-9040-bfaea3b6caf1/volumes" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.147088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zh56\" (UniqueName: \"kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.147356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.147469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbkkp\" (UniqueName: \"kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp\") pod \"novacell08947-account-delete-x2bgc\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") " pod="openstack/novacell08947-account-delete-x2bgc"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.149254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft"
Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.164889 4749 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-9lxvg" message="Exiting ovn-controller (1) "
Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.164932 4749 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-9lxvg" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller" containerID="cri-o://da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.164963 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-9lxvg" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller" containerID="cri-o://da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109" gracePeriod=30
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.181455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zh56\" (UniqueName: \"kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56\") pod \"novaapi3ebb-account-delete-5jgft\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " pod="openstack/novaapi3ebb-account-delete-5jgft"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.185156 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a76ca241-80b1-4019-b42a-12ecc908016c" path="/var/lib/kubelet/pods/a76ca241-80b1-4019-b42a-12ecc908016c/volumes"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.202952 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef3ffdde-1c21-47d4-9a07-a84008344e08" path="/var/lib/kubelet/pods/ef3ffdde-1c21-47d4-9a07-a84008344e08/volumes"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.203609 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9b28632-689c-484b-91b4-c57e9d67a6cf" path="/var/lib/kubelet/pods/f9b28632-689c-484b-91b4-c57e9d67a6cf/volumes"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.207613 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="ovsdbserver-sb" containerID="cri-o://54b419e178792015e0c2b8fac863bb01960334b0e71784b51c9825ddfc510195" gracePeriod=300
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.220593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.220637 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hdhbv"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.250882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbkkp\" (UniqueName: \"kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp\") pod \"novacell08947-account-delete-x2bgc\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") " pod="openstack/novacell08947-account-delete-x2bgc"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.250957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts\") pod \"novacell08947-account-delete-x2bgc\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") " pod="openstack/novacell08947-account-delete-x2bgc"
Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.253220 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.253271 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data podName:9a9603fe-72d8-479a-86be-9b914455fba1 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:39.25325599 +0000 UTC m=+1542.425405847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data") pod "rabbitmq-cell1-server-0" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1") : configmap "rabbitmq-cell1-config-data" not found
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.253816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts\") pod \"novacell08947-account-delete-x2bgc\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") " pod="openstack/novacell08947-account-delete-x2bgc"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.255403 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hdhbv"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.262408 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.263150 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78bc78f9d8-g85sc" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-log" containerID="cri-o://e944097794538d3c014641110ebd81f38c1dcad22ea2f4aa4c69b522c4b836ee" gracePeriod=30
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.264014 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78bc78f9d8-g85sc" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-api" containerID="cri-o://e2aab48732b202c3e72e85886215cd2e441185bbc74a2b999dc3bff8161959f4" gracePeriod=30
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.273423 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pbnc"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.280405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7pbnc"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.289565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.296174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbkkp\" (UniqueName: \"kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp\") pod \"novacell08947-account-delete-x2bgc\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") " pod="openstack/novacell08947-account-delete-x2bgc"
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.303115 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"]
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.303380 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b4fcb8ff-s4n9j" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-api" containerID="cri-o://87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d" gracePeriod=30
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.303794 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-75b4fcb8ff-s4n9j" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-httpd" containerID="cri-o://12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10" gracePeriod=30
Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.309723 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"]
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.310106 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-744d76c7bb-6xh5q" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api-log" containerID="cri-o://02e31a093f3745f70d2f114e3071dcc2ff626d6fa46cc35da2f370cfb31b8cce" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.310490 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-744d76c7bb-6xh5q" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api" containerID="cri-o://405921777b6096485df779e9382966efbbcbdbe565e378910a31ae009c151eab" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.319314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.321743 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.321935 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener-log" containerID="cri-o://d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.322839 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener" containerID="cri-o://3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.349276 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.349737 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker-log" containerID="cri-o://23c1f9a9695d62882d05b68f2a362774fd211e09dcdb543ab2bb762d27b58e29" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.350177 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker" containerID="cri-o://ac6b10cbf89933157f87ce0832e498f620cc616f9e2d34fbbac8faa2f0a1cdbd" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.360712 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.360761 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data podName:31a44203-fd94-4eb4-952f-d54a5c577095 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:37.860747596 +0000 UTC m=+1541.032897453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data") pod "rabbitmq-server-0" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095") : configmap "rabbitmq-config-data" not found Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.365259 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.407561 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.407808 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-log" containerID="cri-o://2156bc219030d070a0c4da6cd4ca235e2905d2b767d2af6d322ac73d528b20d1" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.407946 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-api" containerID="cri-o://42f0d632bd584d2a9e5b8e0e0d9b68c2c2cb9b1edeb4d0f8370b7e18d29880ce" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.425331 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.426098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log" containerID="cri-o://8aabc98b71671061ad9285c52d012f1555e22985b6de832773cf911ed650a0eb" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.431094 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" containerID="cri-o://5de820b5dca490141173e58c8807e85dfc695c9b0141c1364e49c90d3e7f477e" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.440574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-cbs5l"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.457942 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-cbs5l"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.460982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.461042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.461969 4749 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.462054 4749 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:38.462013558 +0000 UTC m=+1541.634163415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : configmap "openstack-cell1-scripts" not found Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.465329 4749 projected.go:194] Error preparing data for projected volume kube-api-access-v46xk for pod openstack/novacell1089c-account-delete-d5r26: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.465394 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:38.465378471 +0000 UTC m=+1541.637528328 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-v46xk" (UniqueName: "kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.536125 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4a087f05-8b7d-4207-88e8-1c622d57c653/ovsdbserver-sb/0.log" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.536181 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerID="6e9657e12a54da3db09ff22f9ac8c9eb8a67c2424dfe7eae2c12bb076d0ac942" exitCode=2 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.536299 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerID="54b419e178792015e0c2b8fac863bb01960334b0e71784b51c9825ddfc510195" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.536351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerDied","Data":"6e9657e12a54da3db09ff22f9ac8c9eb8a67c2424dfe7eae2c12bb076d0ac942"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.536379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerDied","Data":"54b419e178792015e0c2b8fac863bb01960334b0e71784b51c9825ddfc510195"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.538082 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b102261-e51d-4b92-a267-167f1ffd0a41" containerID="b48098c5e4956d09db8ec03faedfc1ef2c22fcc5a22369e6385279648a81cfce" exitCode=137 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.541989 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d619b935-7717-4e88-af76-97e946d3cef5/ovsdbserver-nb/0.log" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.542071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"d619b935-7717-4e88-af76-97e946d3cef5","Type":"ContainerDied","Data":"0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.542098 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b1fc67c499e90d71161ea778582adf3bde479858063b4d20a377e392fe03f73" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.544935 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6zwjh_9e2ea248-43ad-4851-96ea-3e6adaba3ef0/openstack-network-exporter/0.log" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.544981 4749 generic.go:334] "Generic (PLEG): container finished" podID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" containerID="23d0cb96cf04c81af3bc6f2003ca35fb96d25512a2dad19539c520ac601aba5c" exitCode=2 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.545044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6zwjh" event={"ID":"9e2ea248-43ad-4851-96ea-3e6adaba3ef0","Type":"ContainerDied","Data":"23d0cb96cf04c81af3bc6f2003ca35fb96d25512a2dad19539c520ac601aba5c"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.549259 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerID="f72654b1f6476b877d4a3b3f9bd36b7ac0424fa98f947f3608445049745b170c" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.549332 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerDied","Data":"f72654b1f6476b877d4a3b3f9bd36b7ac0424fa98f947f3608445049745b170c"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.583832 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.595405 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.616270 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell08947-account-delete-x2bgc" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.625545 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-089c-account-create-update-cxbw6"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.635604 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-089c-account-create-update-cxbw6"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649691 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="efa94720f5ca79ab7d9121540b501d8e5d9310c95e8baf7e219e6f4cee72dadd" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649720 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="280811dcf44a523f325934a8c2570ea8ae0d344113018439d508fa6efc5324ec" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649727 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="a78d3a0a66909398264af9304c9055f1ddb5c500bf5846177971f56a8ebc2113" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649733 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="e9c55bc5cd269128e765337cd53c4eb2aa665fc0070c9dbb2cddb9feb42c9d56" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649740 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="2153b28d7b1ae703a650e64e126179253f7846999b9a6400cfd009a599bbb246" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649748 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="bfafc82b272a3145020f82bc16f80a2708db4958f657da2553e53da41914e8a6" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.649756 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="9f220385e1bef364c02962b269f0e2db7e6230aaf50d37f52f736cd72a640af1" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"efa94720f5ca79ab7d9121540b501d8e5d9310c95e8baf7e219e6f4cee72dadd"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"280811dcf44a523f325934a8c2570ea8ae0d344113018439d508fa6efc5324ec"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"a78d3a0a66909398264af9304c9055f1ddb5c500bf5846177971f56a8ebc2113"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"e9c55bc5cd269128e765337cd53c4eb2aa665fc0070c9dbb2cddb9feb42c9d56"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"2153b28d7b1ae703a650e64e126179253f7846999b9a6400cfd009a599bbb246"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"bfafc82b272a3145020f82bc16f80a2708db4958f657da2553e53da41914e8a6"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"9f220385e1bef364c02962b269f0e2db7e6230aaf50d37f52f736cd72a640af1"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650218 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"c135e668a9588d4d539c1f9ecce850efa84d0c9cf2ace29a444e0f2d6c4d4e1d"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650583 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="c135e668a9588d4d539c1f9ecce850efa84d0c9cf2ace29a444e0f2d6c4d4e1d" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650621 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="5cd4b93f19f43d8c22124c663685c2a5b7c3218c0e81bb3c102046e182dec0d8" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650627 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="e391c22d2e560a1daaa903407899dba8b65a96672ec1d83469d63ccacf47014a" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650634 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="c6544b18ba1c97f83dddee9f06c12698f0b180173fc59826441ce3ebe0d76ccb" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650639 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="f9bfeddd31df51ff31ba3def5c4f3f2e8ba2fc1efc9f023e76e32fba61e40263" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650646 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="e28b38f1e7521875f74819f009a7406efa1e468522d52a3aae45ded543cc2908" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650652 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="019a47cff718db2d0002a6650c9afb4a966ae40b1aef5de5b8fac16e7100d973" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"5cd4b93f19f43d8c22124c663685c2a5b7c3218c0e81bb3c102046e182dec0d8"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"e391c22d2e560a1daaa903407899dba8b65a96672ec1d83469d63ccacf47014a"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650738 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"c6544b18ba1c97f83dddee9f06c12698f0b180173fc59826441ce3ebe0d76ccb"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"f9bfeddd31df51ff31ba3def5c4f3f2e8ba2fc1efc9f023e76e32fba61e40263"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"e28b38f1e7521875f74819f009a7406efa1e468522d52a3aae45ded543cc2908"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.650787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"019a47cff718db2d0002a6650c9afb4a966ae40b1aef5de5b8fac16e7100d973"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.656499 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1089c-account-delete-d5r26"] Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.657399 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-v46xk operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/novacell1089c-account-delete-d5r26" podUID="eb4218c5-c97a-4e5d-8d70-28bc3d763d8a" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.657401 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="rabbitmq" containerID="cri-o://00208c9ed91f795bee62848698e8c46182c049d63a9adf626a2bc73dc90a56e8" gracePeriod=604800 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.661123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.661311 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerName="nova-cell0-conductor-conductor" containerID="cri-o://b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.674091 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.676880 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8f2xf"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.677537 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d619b935-7717-4e88-af76-97e946d3cef5/ovsdbserver-nb/0.log" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.677596 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.689724 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-8f2xf"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.691274 4749 generic.go:334] "Generic (PLEG): container finished" podID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerID="eced19490e815516b429f396da09f8236d99a0db8fbad5ff60fe95e770b29786" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.691342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerDied","Data":"eced19490e815516b429f396da09f8236d99a0db8fbad5ff60fe95e770b29786"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.702254 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f6pkx"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.702854 4749 generic.go:334] "Generic (PLEG): container finished" podID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerID="f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.702931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" event={"ID":"6565857b-6329-46a9-b8c0-4cad1019c4b9","Type":"ContainerDied","Data":"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.702959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" event={"ID":"6565857b-6329-46a9-b8c0-4cad1019c4b9","Type":"ContainerDied","Data":"46d703c25d7c0e36de462abdba7a4eec4f3f242271be53b25bfc3c62c5458709"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.702975 4749 scope.go:117] "RemoveContainer" containerID="f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.703090 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-rxq45" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.715025 4749 generic.go:334] "Generic (PLEG): container finished" podID="65fd8520-689b-4f93-850e-bac0cec97025" containerID="da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109" exitCode=0 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.715081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg" event={"ID":"65fd8520-689b-4f93-850e-bac0cec97025","Type":"ContainerDied","Data":"da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.716010 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.716186 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.727033 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-f6pkx"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.728232 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerID="81a360b2af0a80524dd327e6ff413357d3ae93289f88811a965e85a9ada36073" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.728295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerDied","Data":"81a360b2af0a80524dd327e6ff413357d3ae93289f88811a965e85a9ada36073"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.736172 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.747468 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.747679 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerName="nova-scheduler-scheduler" containerID="cri-o://7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.748358 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="823cce34-3656-4edf-9197-5586262263ec" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5efba0e169acdd879816caeca72282b7bac80064e6baf6bb9765579a9239ea9d" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.751919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance261d-account-delete-pndlc"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770694 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxbvc\" (UniqueName: \"kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770890 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmq9k\" (UniqueName: \"kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.770997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.771030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.771080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs\") pod \"d619b935-7717-4e88-af76-97e946d3cef5\" (UID: \"d619b935-7717-4e88-af76-97e946d3cef5\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.771104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb\") pod \"6565857b-6329-46a9-b8c0-4cad1019c4b9\" (UID: \"6565857b-6329-46a9-b8c0-4cad1019c4b9\") " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.771351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance261d-account-delete-pndlc" event={"ID":"132885d9-92b8-4cf2-b8e2-7f365fd0d020","Type":"ContainerStarted","Data":"f0c806610465678d9c16dad7e18f9792372a73b28f94f676f0104cfd51a81a20"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.772835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts" (OuterVolumeSpecName: "scripts") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.786678 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.787347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config" (OuterVolumeSpecName: "config") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.796285 4749 generic.go:334] "Generic (PLEG): container finished" podID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerID="02e31a093f3745f70d2f114e3071dcc2ff626d6fa46cc35da2f370cfb31b8cce" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.796665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerDied","Data":"02e31a093f3745f70d2f114e3071dcc2ff626d6fa46cc35da2f370cfb31b8cce"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.827358 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerID="e944097794538d3c014641110ebd81f38c1dcad22ea2f4aa4c69b522c4b836ee" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.827493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerDied","Data":"e944097794538d3c014641110ebd81f38c1dcad22ea2f4aa4c69b522c4b836ee"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.832686 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc" (OuterVolumeSpecName: "kube-api-access-sxbvc") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "kube-api-access-sxbvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.834368 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.839118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.839125 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerID="d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09" exitCode=143 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.839428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerDied","Data":"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09"} Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.841630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k" (OuterVolumeSpecName: "kube-api-access-pmq9k") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "kube-api-access-pmq9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.859009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"] Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876136 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876165 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmq9k\" (UniqueName: \"kubernetes.io/projected/6565857b-6329-46a9-b8c0-4cad1019c4b9-kube-api-access-pmq9k\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876189 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876216 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d619b935-7717-4e88-af76-97e946d3cef5-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876225 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d619b935-7717-4e88-af76-97e946d3cef5-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.876236 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxbvc\" (UniqueName: \"kubernetes.io/projected/d619b935-7717-4e88-af76-97e946d3cef5-kube-api-access-sxbvc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.876680 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.876830 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data podName:31a44203-fd94-4eb4-952f-d54a5c577095 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:38.876811736 +0000 UTC m=+1542.048961593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data") pod "rabbitmq-server-0" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095") : configmap "rabbitmq-config-data" not found Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.877389 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cindere578-account-delete-x9bh8"] Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.881560 4749 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Nov 29 01:36:37 crc kubenswrapper[4749]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 29 01:36:37 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNBridge=br-int Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642 Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNEncapType=geneve Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNAvailabilityZones= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true Nov 29 01:36:37 crc kubenswrapper[4749]: ++ PhysicalNetworks= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNHostName= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 29 01:36:37 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 29 01:36:37 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + sleep 0.5 Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore Nov 29 01:36:37 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 29 01:36:37 crc kubenswrapper[4749]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-2k6f9" message=< Nov 29 01:36:37 crc kubenswrapper[4749]: Exiting ovsdb-server (5) [ OK ] Nov 29 01:36:37 crc kubenswrapper[4749]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 29 01:36:37 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNBridge=br-int Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642 Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNEncapType=geneve Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNAvailabilityZones= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true Nov 29 01:36:37 crc kubenswrapper[4749]: ++ PhysicalNetworks= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNHostName= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 29 01:36:37 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 29 01:36:37 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + sleep 0.5 Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore Nov 29 01:36:37 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 29 01:36:37 crc kubenswrapper[4749]: > Nov 29 01:36:37 crc kubenswrapper[4749]: E1129 01:36:37.881601 4749 kuberuntime_container.go:691] "PreStop hook failed" err=< Nov 29 01:36:37 crc kubenswrapper[4749]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 29 01:36:37 crc kubenswrapper[4749]: + source /usr/local/bin/container-scripts/functions Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNBridge=br-int Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNRemote=tcp:localhost:6642 Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNEncapType=geneve Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNAvailabilityZones= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ EnableChassisAsGateway=true Nov 29 01:36:37 crc kubenswrapper[4749]: ++ PhysicalNetworks= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ OVNHostName= Nov 29 01:36:37 crc kubenswrapper[4749]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 29 01:36:37 crc kubenswrapper[4749]: ++ ovs_dir=/var/lib/openvswitch Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 29 01:36:37 crc kubenswrapper[4749]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 29 01:36:37 crc kubenswrapper[4749]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + sleep 0.5 Nov 29 01:36:37 crc kubenswrapper[4749]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 29 01:36:37 crc kubenswrapper[4749]: + cleanup_ovsdb_server_semaphore Nov 29 01:36:37 crc kubenswrapper[4749]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 29 01:36:37 crc kubenswrapper[4749]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Nov 29 01:36:37 crc kubenswrapper[4749]: > pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" containerID="cri-o://6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.881632 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" containerID="cri-o://6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.908125 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.912801 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="galera" containerID="cri-o://caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" gracePeriod=30 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.929385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.961041 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="rabbitmq" containerID="cri-o://eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1" gracePeriod=604800 Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.979421 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.979594 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.987903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config" (OuterVolumeSpecName: "config") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.987959 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Nov 29 01:36:37 crc kubenswrapper[4749]: I1129 01:36:37.999771 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.032894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: W1129 01:36:38.054999 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a342ef2_ab29_4277_a7b4_0f83e5c3ca21.slice/crio-9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7 WatchSource:0}: Error finding container 9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7: Status 404 returned error can't find the container with id 9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7 Nov 29 01:36:38 crc kubenswrapper[4749]: W1129 01:36:38.064172 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc49a0e83_3d6d_47e1_8ca4_4cae46d23168.slice/crio-8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609 WatchSource:0}: Error finding container 8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609: Status 404 returned error can't find the container with id 8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.081022 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.081048 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.081059 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.081069 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.149101 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.149354 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-84b776bb8c-llx6x" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-httpd" containerID="cri-o://3456069d4e6f0aebd8654c15b636090a841895914222fd2b0f04601cdc235e31" gracePeriod=30 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.149825 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-84b776bb8c-llx6x" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-server" containerID="cri-o://b63a6f1d3dc68fbb3a1047eaf37bb7a13d0fbf9d362ca277af916b69381c4f06" gracePeriod=30 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.150432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.165865 4749 scope.go:117] "RemoveContainer" containerID="9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.182675 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.190475 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" containerID="cri-o://cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" gracePeriod=29 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.190789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d619b935-7717-4e88-af76-97e946d3cef5" (UID: "d619b935-7717-4e88-af76-97e946d3cef5"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.220305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6565857b-6329-46a9-b8c0-4cad1019c4b9" (UID: "6565857b-6329-46a9-b8c0-4cad1019c4b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.237791 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.284740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config\") pod \"5b102261-e51d-4b92-a267-167f1ffd0a41\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.284807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle\") pod \"5b102261-e51d-4b92-a267-167f1ffd0a41\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.285116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvtq\" (UniqueName: \"kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq\") pod \"5b102261-e51d-4b92-a267-167f1ffd0a41\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.285301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret\") pod \"5b102261-e51d-4b92-a267-167f1ffd0a41\" (UID: \"5b102261-e51d-4b92-a267-167f1ffd0a41\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.285705 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d619b935-7717-4e88-af76-97e946d3cef5-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.285716 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6565857b-6329-46a9-b8c0-4cad1019c4b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.294577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq" (OuterVolumeSpecName: "kube-api-access-xgvtq") pod "5b102261-e51d-4b92-a267-167f1ffd0a41" (UID: "5b102261-e51d-4b92-a267-167f1ffd0a41"). InnerVolumeSpecName "kube-api-access-xgvtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.351575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b102261-e51d-4b92-a267-167f1ffd0a41" (UID: "5b102261-e51d-4b92-a267-167f1ffd0a41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.352456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5b102261-e51d-4b92-a267-167f1ffd0a41" (UID: "5b102261-e51d-4b92-a267-167f1ffd0a41"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.394326 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.394892 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvtq\" (UniqueName: \"kubernetes.io/projected/5b102261-e51d-4b92-a267-167f1ffd0a41-kube-api-access-xgvtq\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.394985 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.395038 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.431276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5b102261-e51d-4b92-a267-167f1ffd0a41" (UID: "5b102261-e51d-4b92-a267-167f1ffd0a41"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.448417 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-rxq45"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.456411 4749 scope.go:117] "RemoveContainer" containerID="f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9" Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.462864 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9\": container with ID starting with f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9 not found: ID does not exist" containerID="f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.462897 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9"} err="failed to get container status \"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9\": rpc error: code = NotFound desc = could not find container \"f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9\": container with ID starting with f3fd3309df0394829c883fd172a4eeeb80c4d33eb0fa83da21a2372c7a1899c9 not found: ID does not exist" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.462919 4749 scope.go:117] "RemoveContainer" containerID="9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068" Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.464704 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068\": container with ID starting with 9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068 not found: ID does not exist" containerID="9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068" 
Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.464744 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068"} err="failed to get container status \"9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068\": rpc error: code = NotFound desc = could not find container \"9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068\": container with ID starting with 9fc374a3bfda6ae36e0abfcacf1c1b951f4dc7c696e669ff07b61b173ee97068 not found: ID does not exist" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.499323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.499384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.499452 4749 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.499509 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:40.49949286 +0000 UTC m=+1543.671642717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : configmap "openstack-cell1-scripts" not found Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.499528 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5b102261-e51d-4b92-a267-167f1ffd0a41-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.503503 4749 projected.go:194] Error preparing data for projected volume kube-api-access-v46xk for pod openstack/novacell1089c-account-delete-d5r26: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.503550 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:40.50354012 +0000 UTC m=+1543.675689977 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-v46xk" (UniqueName: "kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.763459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.807880 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.860518 4749 generic.go:334] "Generic (PLEG): container finished" podID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerID="23c1f9a9695d62882d05b68f2a362774fd211e09dcdb543ab2bb762d27b58e29" exitCode=143 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.860578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerDied","Data":"23c1f9a9695d62882d05b68f2a362774fd211e09dcdb543ab2bb762d27b58e29"} Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.864843 4749 generic.go:334] "Generic (PLEG): container finished" podID="76910f08-d491-4b48-9439-78baad6ac3d3" containerID="b63a6f1d3dc68fbb3a1047eaf37bb7a13d0fbf9d362ca277af916b69381c4f06" exitCode=0 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.864876 4749 generic.go:334] "Generic (PLEG): container finished" podID="76910f08-d491-4b48-9439-78baad6ac3d3" containerID="3456069d4e6f0aebd8654c15b636090a841895914222fd2b0f04601cdc235e31" exitCode=0 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.864919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerDied","Data":"b63a6f1d3dc68fbb3a1047eaf37bb7a13d0fbf9d362ca277af916b69381c4f06"} Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.864946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerDied","Data":"3456069d4e6f0aebd8654c15b636090a841895914222fd2b0f04601cdc235e31"} Nov 29 01:36:38 crc kubenswrapper[4749]: W1129 01:36:38.865024 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5134b41_98a9_4555_9adb_c988862f59e6.slice/crio-6e56cdb4e94033244a20a4367d016650c4668ba9f08cec1c4f19b24a67ba9cc9 WatchSource:0}: Error finding container 6e56cdb4e94033244a20a4367d016650c4668ba9f08cec1c4f19b24a67ba9cc9: Status 404 returned error can't find the container with id 6e56cdb4e94033244a20a4367d016650c4668ba9f08cec1c4f19b24a67ba9cc9 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.865660 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6zwjh_9e2ea248-43ad-4851-96ea-3e6adaba3ef0/openstack-network-exporter/0.log" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.865737 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.898434 4749 scope.go:117] "RemoveContainer" containerID="b48098c5e4956d09db8ec03faedfc1ef2c22fcc5a22369e6385279648a81cfce" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.898576 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.905821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.905895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.905921 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.905942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbklg\" (UniqueName: \"kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.906087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.906131 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.906222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir\") pod \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\" (UID: \"9e2ea248-43ad-4851-96ea-3e6adaba3ef0\") " Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.906675 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.906734 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 29 01:36:38 crc kubenswrapper[4749]: E1129 01:36:38.906782 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data podName:31a44203-fd94-4eb4-952f-d54a5c577095 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:40.906767881 +0000 UTC m=+1544.078917738 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data") pod "rabbitmq-server-0" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095") : configmap "rabbitmq-config-data" not found Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.906797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config" (OuterVolumeSpecName: "config") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.907076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.929581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg" (OuterVolumeSpecName: "kube-api-access-zbklg") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "kube-api-access-zbklg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.934178 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9lxvg" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.937068 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" exitCode=0 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.937234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerDied","Data":"6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9"} Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.975974 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4a087f05-8b7d-4207-88e8-1c622d57c653/ovsdbserver-sb/0.log" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.976005 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9lxvg" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.976038 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.976075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9lxvg" event={"ID":"65fd8520-689b-4f93-850e-bac0cec97025","Type":"ContainerDied","Data":"0d9a67beb6e686c0fc7eeb05bdbaea75575fe1c7b9a8e4037d3f79f6feccf60f"} Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.979915 4749 scope.go:117] "RemoveContainer" containerID="da6648012c83a6cc74d32fcd946afbf2e43e17fe059785d2a3c1297c4a85a109" Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.983293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere578-account-delete-x9bh8" event={"ID":"c49a0e83-3d6d-47e1-8ca4-4cae46d23168","Type":"ContainerStarted","Data":"8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609"} Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.992345 4749 generic.go:334] "Generic (PLEG): container finished" podID="f91885d5-497a-41e4-9796-ca25f184b178" containerID="2156bc219030d070a0c4da6cd4ca235e2905d2b767d2af6d322ac73d528b20d1" exitCode=143 Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.996882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:36:38 crc kubenswrapper[4749]: I1129 01:36:38.996916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerDied","Data":"2156bc219030d070a0c4da6cd4ca235e2905d2b767d2af6d322ac73d528b20d1"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.002052 4749 generic.go:334] "Generic (PLEG): container finished" podID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerID="caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" exitCode=0 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.002122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerDied","Data":"caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.007899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts\") pod 
\"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.007934 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008071 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008153 4749 generic.go:334] "Generic (PLEG): container finished" podID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerID="8aabc98b71671061ad9285c52d012f1555e22985b6de832773cf911ed650a0eb" exitCode=143 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008216 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4q7f\" (UniqueName: \"kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerDied","Data":"8aabc98b71671061ad9285c52d012f1555e22985b6de832773cf911ed650a0eb"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008503 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008541 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.008800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w89sp\" (UniqueName: \"kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.009750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts" (OuterVolumeSpecName: "scripts") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.010746 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts" (OuterVolumeSpecName: "scripts") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.010779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.013095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn\") pod \"65fd8520-689b-4f93-850e-bac0cec97025\" (UID: \"65fd8520-689b-4f93-850e-bac0cec97025\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.013751 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir\") pod \"4a087f05-8b7d-4207-88e8-1c622d57c653\" (UID: \"4a087f05-8b7d-4207-88e8-1c622d57c653\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.019649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.020157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config" (OuterVolumeSpecName: "config") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.021352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run" (OuterVolumeSpecName: "var-run") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.022082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.025293 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.027894 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6zwjh_9e2ea248-43ad-4851-96ea-3e6adaba3ef0/openstack-network-exporter/0.log" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.027988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6zwjh" event={"ID":"9e2ea248-43ad-4851-96ea-3e6adaba3ef0","Type":"ContainerDied","Data":"68f5326f39d1f34d4dae49a13444765c6df1393fec50100f7257dd359a5608a9"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.028186 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6zwjh" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.056565 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp" (OuterVolumeSpecName: "kube-api-access-w89sp") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "kube-api-access-w89sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.059466 4749 generic.go:334] "Generic (PLEG): container finished" podID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerID="12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10" exitCode=0 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.059570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerDied","Data":"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.060832 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061244 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65fd8520-689b-4f93-850e-bac0cec97025-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061296 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061311 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061325 4749 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-ovs-rundir\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061340 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061358 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/65fd8520-689b-4f93-850e-bac0cec97025-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061371 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061389 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061406 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbklg\" (UniqueName: \"kubernetes.io/projected/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-kube-api-access-zbklg\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.061424 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a087f05-8b7d-4207-88e8-1c622d57c653-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.075713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.077110 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4a087f05-8b7d-4207-88e8-1c622d57c653/ovsdbserver-sb/0.log" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.077158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f" (OuterVolumeSpecName: "kube-api-access-d4q7f") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "kube-api-access-d4q7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.080461 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.091124 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.097139 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerID="5ed421f49bc336804d73a5ee8923bd091bdb89ba7051efb3930f5c62c2dfef03" exitCode=0 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.105675 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9e0c24-080d-40d1-a0b5-e796627265e7" path="/var/lib/kubelet/pods/1a9e0c24-080d-40d1-a0b5-e796627265e7/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.106339 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2df569-0f1e-43d6-a4aa-983e7af9b753" path="/var/lib/kubelet/pods/1e2df569-0f1e-43d6-a4aa-983e7af9b753/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.106918 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38de4230-2953-456e-86a4-9c6a837a9592" path="/var/lib/kubelet/pods/38de4230-2953-456e-86a4-9c6a837a9592/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.130597 4749 scope.go:117] "RemoveContainer" containerID="23d0cb96cf04c81af3bc6f2003ca35fb96d25512a2dad19539c520ac601aba5c" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.130861 4749 generic.go:334] "Generic (PLEG): container finished" podID="823cce34-3656-4edf-9197-5586262263ec" containerID="5efba0e169acdd879816caeca72282b7bac80064e6baf6bb9765579a9239ea9d" exitCode=0 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.131005 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.131247 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.131758 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.154038 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400b5586-62b7-420f-86ea-67d92778f6f2" path="/var/lib/kubelet/pods/400b5586-62b7-420f-86ea-67d92778f6f2/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.155585 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b102261-e51d-4b92-a267-167f1ffd0a41" path="/var/lib/kubelet/pods/5b102261-e51d-4b92-a267-167f1ffd0a41/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.156172 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" path="/var/lib/kubelet/pods/6565857b-6329-46a9-b8c0-4cad1019c4b9/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.157464 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b724452e-7b4b-4a1b-af78-754fee94a0b5" path="/var/lib/kubelet/pods/b724452e-7b4b-4a1b-af78-754fee94a0b5/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.158691 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d35e2c-828c-4e65-bd18-117a1b053783" path="/var/lib/kubelet/pods/e6d35e2c-828c-4e65-bd18-117a1b053783/volumes" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.166534 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs\") pod \"823cce34-3656-4edf-9197-5586262263ec\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.166631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data\") pod \"823cce34-3656-4edf-9197-5586262263ec\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.166760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5v4\" (UniqueName: \"kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4\") pod \"823cce34-3656-4edf-9197-5586262263ec\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.166844 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle\") pod \"823cce34-3656-4edf-9197-5586262263ec\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.166888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs\") pod \"823cce34-3656-4edf-9197-5586262263ec\" (UID: \"823cce34-3656-4edf-9197-5586262263ec\") " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.167285 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4q7f\" (UniqueName: \"kubernetes.io/projected/4a087f05-8b7d-4207-88e8-1c622d57c653-kube-api-access-d4q7f\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.167311 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.167321 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w89sp\" (UniqueName: \"kubernetes.io/projected/65fd8520-689b-4f93-850e-bac0cec97025-kube-api-access-w89sp\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.167528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191269 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"] Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4a087f05-8b7d-4207-88e8-1c622d57c653","Type":"ContainerDied","Data":"a6ff43f195e2ec7ab096b323215b989009c6e18373a4a264d1fcf19c85ccc8bd"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerDied","Data":"5ed421f49bc336804d73a5ee8923bd091bdb89ba7051efb3930f5c62c2dfef03"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican7ea4-account-delete-hg54s" event={"ID":"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21","Type":"ContainerStarted","Data":"9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"823cce34-3656-4edf-9197-5586262263ec","Type":"ContainerDied","Data":"5efba0e169acdd879816caeca72282b7bac80064e6baf6bb9765579a9239ea9d"} Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.191387 4749 scope.go:117] "RemoveContainer" containerID="6e9657e12a54da3db09ff22f9ac8c9eb8a67c2424dfe7eae2c12bb076d0ac942" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.208241 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4" (OuterVolumeSpecName: "kube-api-access-7g5v4") pod "823cce34-3656-4edf-9197-5586262263ec" (UID: "823cce34-3656-4edf-9197-5586262263ec"). InnerVolumeSpecName "kube-api-access-7g5v4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.213774 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.228746 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.243583 4749 scope.go:117] "RemoveContainer" containerID="54b419e178792015e0c2b8fac863bb01960334b0e71784b51c9825ddfc510195" Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.251048 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79 is running failed: container process not found" containerID="caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.254898 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79 is running failed: container process not found" containerID="caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.256480 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79 is running failed: container process not found" containerID="caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.256558 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="galera" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.273375 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5v4\" (UniqueName: \"kubernetes.io/projected/823cce34-3656-4edf-9197-5586262263ec-kube-api-access-7g5v4\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.273467 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.273519 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data podName:9a9603fe-72d8-479a-86be-9b914455fba1 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:43.273501827 +0000 UTC m=+1546.445651684 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data") pod "rabbitmq-cell1-server-0" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1") : configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.292040 4749 scope.go:117] "RemoveContainer" containerID="5efba0e169acdd879816caeca72282b7bac80064e6baf6bb9765579a9239ea9d" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.373042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "823cce34-3656-4edf-9197-5586262263ec" (UID: "823cce34-3656-4edf-9197-5586262263ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.379148 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.484542 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.486718 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.542059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.578484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.588687 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.588710 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.607124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data" (OuterVolumeSpecName: "config-data") pod "823cce34-3656-4edf-9197-5586262263ec" (UID: "823cce34-3656-4edf-9197-5586262263ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.611480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9e2ea248-43ad-4851-96ea-3e6adaba3ef0" (UID: "9e2ea248-43ad-4851-96ea-3e6adaba3ef0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.621240 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "823cce34-3656-4edf-9197-5586262263ec" (UID: "823cce34-3656-4edf-9197-5586262263ec"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.646182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "823cce34-3656-4edf-9197-5586262263ec" (UID: "823cce34-3656-4edf-9197-5586262263ec"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.679831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.691549 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e2ea248-43ad-4851-96ea-3e6adaba3ef0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.691586 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.691623 4749 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.691640 4749 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.691652 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823cce34-3656-4edf-9197-5586262263ec-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.703364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "65fd8520-689b-4f93-850e-bac0cec97025" (UID: "65fd8520-689b-4f93-850e-bac0cec97025"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.771373 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "4a087f05-8b7d-4207-88e8-1c622d57c653" (UID: "4a087f05-8b7d-4207-88e8-1c622d57c653"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.794799 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a087f05-8b7d-4207-88e8-1c622d57c653-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.794831 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/65fd8520-689b-4f93-850e-bac0cec97025-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.877760 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.878042 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.878230 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.878257 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.885727 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.893266 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.898285 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:39 crc kubenswrapper[4749]: E1129 01:36:39.898353 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.990161 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.990706 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-central-agent" containerID="cri-o://07a474eb6ac2724e1b0cdecbb5da046a4117a4bce874254d62a90db087962686" gracePeriod=30 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.992329 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="proxy-httpd" containerID="cri-o://b6161f4f801200f63a69f8699a09c9b84f3a1ded5a4c508dc41de08731035fec" gracePeriod=30 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.992380 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-notification-agent" containerID="cri-o://5fea5dc22f78f4b44307aafccca2e0586ca8c1965943165df879437b824209c1" gracePeriod=30 Nov 29 01:36:39 crc kubenswrapper[4749]: I1129 01:36:39.992509 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="sg-core" containerID="cri-o://9eed71d2ea4b0dda3706b093b511a6005492203beb6c1a0d7d09b12729581237" gracePeriod=30 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.038326 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.038546 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerName="kube-state-metrics" containerID="cri-o://a92e13de7e8f2651d46dd244243eee9c0810da21b48c9f85f4c7ffacbccd381e" gracePeriod=30 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.069168 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-75b4fcb8ff-s4n9j" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9696/\": dial tcp 10.217.0.148:9696: connect: connection refused" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.084479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-84b776bb8c-llx6x" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110009 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqs4c\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110348 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110367 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.110393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift\") pod \"76910f08-d491-4b48-9439-78baad6ac3d3\" (UID: \"76910f08-d491-4b48-9439-78baad6ac3d3\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.120331 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.122046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c" (OuterVolumeSpecName: "kube-api-access-bqs4c") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "kube-api-access-bqs4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.122754 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.126386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.130157 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.133976 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx4jh\" (UniqueName: \"kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213303 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc 
kubenswrapper[4749]: I1129 01:36:40.213355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle\") pod \"5906d408-1c10-4c55-a07b-f94d302a08c6\" (UID: \"5906d408-1c10-4c55-a07b-f94d302a08c6\") " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213830 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqs4c\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-kube-api-access-bqs4c\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213841 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213850 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/76910f08-d491-4b48-9439-78baad6ac3d3-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.213858 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76910f08-d491-4b48-9439-78baad6ac3d3-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.215093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.216389 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.219355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.219594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.230413 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerID="f743ba6ff8b1da63b560049f4030c6a63de306ec90e8671821fcc72e7725c52a" exitCode=0 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.230494 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerDied","Data":"f743ba6ff8b1da63b560049f4030c6a63de306ec90e8671821fcc72e7725c52a"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.242010 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9lxvg"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.244013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh" (OuterVolumeSpecName: "kube-api-access-tx4jh") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "kube-api-access-tx4jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.292286 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.292469 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="3962a4be-25eb-45f6-8b1a-f84341319df3" containerName="memcached" containerID="cri-o://d8dc0505ade6c3a7b001fd49ab3118193dd6af3d47615b2a7ee7ceba6fef3ffc" gracePeriod=30 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.295526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5906d408-1c10-4c55-a07b-f94d302a08c6","Type":"ContainerDied","Data":"2532e9289f06b649b0623532b9fa6e8c2f9c88941885cc1d45dc17aee1e33643"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.295577 4749 scope.go:117] "RemoveContainer" containerID="caf958884bb749802acac69c6cd81bd8a31a3db57010f0ebd7153e1d52d6de79" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.295688 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.304812 4749 generic.go:334] "Generic (PLEG): container finished" podID="1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" containerID="66d79493b0a5b5a1671ff38a71caa57e6969e56432f8df85f19ce7f9774d8fbe" exitCode=0 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.304986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican7ea4-account-delete-hg54s" event={"ID":"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21","Type":"ContainerDied","Data":"66d79493b0a5b5a1671ff38a71caa57e6969e56432f8df85f19ce7f9774d8fbe"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.308301 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.319528 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.325504 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx4jh\" (UniqueName: \"kubernetes.io/projected/5906d408-1c10-4c55-a07b-f94d302a08c6-kube-api-access-tx4jh\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.325582 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.325684 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.326360 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5906d408-1c10-4c55-a07b-f94d302a08c6-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.320692 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:56602->10.217.0.177:9292: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.320392 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:56594->10.217.0.177:9292: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.350083 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.350935 4749 generic.go:334] "Generic (PLEG): container finished" podID="c49a0e83-3d6d-47e1-8ca4-4cae46d23168" containerID="3baf52548ae23fe4dec7847cae0b2d3819b57cfb6e4936dd7cf1ef54cfe2c8b3" exitCode=0 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.351026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere578-account-delete-x9bh8" 
event={"ID":"c49a0e83-3d6d-47e1-8ca4-4cae46d23168","Type":"ContainerDied","Data":"3baf52548ae23fe4dec7847cae0b2d3819b57cfb6e4936dd7cf1ef54cfe2c8b3"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.358088 4749 scope.go:117] "RemoveContainer" containerID="d10c8e8eafcd0f30d5b2663c5e1152a35d0cef673fe747a79ecbeee6e1f7e5c1" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.361680 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.365314 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-84b776bb8c-llx6x" event={"ID":"76910f08-d491-4b48-9439-78baad6ac3d3","Type":"ContainerDied","Data":"bb70d12fcb9e5fb42dda77e67b8bad95433e6a3632c0ea01f446519d352d2834"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.365376 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-84b776bb8c-llx6x" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.368734 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.371616 4749 generic.go:334] "Generic (PLEG): container finished" podID="132885d9-92b8-4cf2-b8e2-7f365fd0d020" containerID="286081c662919600ce871444f21fff6de8afe86aaa6b772098323d22bdea23b5" exitCode=0 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.371665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance261d-account-delete-pndlc" event={"ID":"132885d9-92b8-4cf2-b8e2-7f365fd0d020","Type":"ContainerDied","Data":"286081c662919600ce871444f21fff6de8afe86aaa6b772098323d22bdea23b5"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.374498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell08947-account-delete-x2bgc" event={"ID":"cd7300ea-27b0-438e-981c-7b862054e630","Type":"ContainerStarted","Data":"ef8f8db098b5bc8512d86217bb4e3b30090a13e3fae92a174b5f6fba0e25561b"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.374523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.376855 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.385325 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vzbzm"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.388502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi3ebb-account-delete-5jgft" event={"ID":"8cb80122-cbde-418d-8f3f-367087068603","Type":"ContainerStarted","Data":"5f1ec69a9ae451e2eb713f1c052ad20d15584974edf36c0cf96f0a6f18f30ace"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.390240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5123-account-delete-qmfnv" event={"ID":"d5134b41-98a9-4555-9adb-c988862f59e6","Type":"ContainerStarted","Data":"6e56cdb4e94033244a20a4367d016650c4668ba9f08cec1c4f19b24a67ba9cc9"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.391846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.392208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7db1-account-delete-x9f9k" event={"ID":"7a220429-0482-4578-9268-d4127f8da9af","Type":"ContainerStarted","Data":"7f74bb0b5360f1feba86f4744283da2bcdd838d7e7ad8f935e865bede7d3fa8c"} Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.402981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bvc2f"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.420057 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6zwjh"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.430511 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.436761 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bvc2f"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.443703 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vzbzm"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.463060 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.467352 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.467633 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5784c8bdbd-lvrsx" podUID="28df216a-4f1e-449f-aaf6-45fd12929ad8" containerName="keystone-api" containerID="cri-o://41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349" gracePeriod=30 Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.496445 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.191:8081/readyz\": dial tcp 10.217.0.191:8081: connect: connection refused" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.506991 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q6wqg"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.526245 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q6wqg"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.530886 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6a3a-account-create-update-h42h7"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.532235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") pod \"novacell1089c-account-delete-d5r26\" (UID: \"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.532278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") pod \"novacell1089c-account-delete-d5r26\" (UID: 
\"eb4218c5-c97a-4e5d-8d70-28bc3d763d8a\") " pod="openstack/novacell1089c-account-delete-d5r26" Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.532625 4749 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.532661 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:44.532648836 +0000 UTC m=+1547.704798683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : configmap "openstack-cell1-scripts" not found Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.535425 4749 projected.go:194] Error preparing data for projected volume kube-api-access-v46xk for pod openstack/novacell1089c-account-delete-d5r26: failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.535477 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk podName:eb4218c5-c97a-4e5d-8d70-28bc3d763d8a nodeName:}" failed. No retries permitted until 2025-11-29 01:36:44.535465006 +0000 UTC m=+1547.707614863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-v46xk" (UniqueName: "kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk") pod "novacell1089c-account-delete-d5r26" (UID: "eb4218c5-c97a-4e5d-8d70-28bc3d763d8a") : failed to fetch token: serviceaccounts "galera-openstack-cell1" not found Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.543020 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6a3a-account-create-update-h42h7"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.565945 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-74cnm"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.630310 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-74cnm"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.638090 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cindere578-account-delete-x9bh8"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.665370 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e578-account-create-update-jwhdh"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.685970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.732341 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e578-account-create-update-jwhdh"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.739529 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.744358 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.764700 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-78bc78f9d8-g85sc" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.149:8778/\": dial tcp 10.217.0.149:8778: connect: connection refused" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.764799 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-78bc78f9d8-g85sc" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.149:8778/\": dial tcp 10.217.0.149:8778: connect: connection refused" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.811565 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jq4xn"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.820455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.820538 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jq4xn"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.841008 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.841042 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.842259 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7ea4-account-create-update-x94pt"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.851685 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.859306 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7ea4-account-create-update-x94pt"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.861457 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.869215 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xrwj6"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.876486 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xrwj6"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.888820 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7db1-account-create-update-r9wsv"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.892794 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.895995 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744d76c7bb-6xh5q" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:43788->10.217.0.155:9311: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.897388 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-744d76c7bb-6xh5q" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.155:9311/healthcheck\": read tcp 10.217.0.2:43802->10.217.0.155:9311: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.902460 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:51006->10.217.0.203:8775: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.902478 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" 
podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:51004->10.217.0.203:8775: read: connection reset by peer" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.908771 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7db1-account-create-update-r9wsv"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.912057 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8kwj6"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.917070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data" (OuterVolumeSpecName: "config-data") pod "76910f08-d491-4b48-9439-78baad6ac3d3" (UID: "76910f08-d491-4b48-9439-78baad6ac3d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.919559 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8kwj6"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.933943 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5123-account-create-update-2hbcf"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.941049 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.943449 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.943466 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76910f08-d491-4b48-9439-78baad6ac3d3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.943524 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 29 01:36:40 crc kubenswrapper[4749]: E1129 01:36:40.943569 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data podName:31a44203-fd94-4eb4-952f-d54a5c577095 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:44.943555628 +0000 UTC m=+1548.115705485 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data") pod "rabbitmq-server-0" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095") : configmap "rabbitmq-config-data" not found Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.947739 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5123-account-create-update-2hbcf"] Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.977989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:40 crc kubenswrapper[4749]: I1129 01:36:40.993568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5906d408-1c10-4c55-a07b-f94d302a08c6" (UID: "5906d408-1c10-4c55-a07b-f94d302a08c6"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.045355 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.045385 4749 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5906d408-1c10-4c55-a07b-f94d302a08c6-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.056319 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="galera" containerID="cri-o://30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254" gracePeriod=30 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.076033 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.076321 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.090430 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079e1231-eb4d-4e9b-b265-f1fd17be981c" path="/var/lib/kubelet/pods/079e1231-eb4d-4e9b-b265-f1fd17be981c/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.090971 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a041864-fa44-412d-a3ef-0d1af966cd48" path="/var/lib/kubelet/pods/3a041864-fa44-412d-a3ef-0d1af966cd48/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.091561 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c744d6e-38ea-451d-abe1-03208a580698" path="/var/lib/kubelet/pods/3c744d6e-38ea-451d-abe1-03208a580698/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.092612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" path="/var/lib/kubelet/pods/4a087f05-8b7d-4207-88e8-1c622d57c653/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.093298 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca66ccd-e7ca-4cbc-84f7-5acafa38d494" path="/var/lib/kubelet/pods/5ca66ccd-e7ca-4cbc-84f7-5acafa38d494/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.093810 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65fd8520-689b-4f93-850e-bac0cec97025" path="/var/lib/kubelet/pods/65fd8520-689b-4f93-850e-bac0cec97025/volumes" 
Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.103447 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823cce34-3656-4edf-9197-5586262263ec" path="/var/lib/kubelet/pods/823cce34-3656-4edf-9197-5586262263ec/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.104002 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91b13a5c-d047-40da-8a7f-87debe2da732" path="/var/lib/kubelet/pods/91b13a5c-d047-40da-8a7f-87debe2da732/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.104725 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bff88ad-b577-46b9-8bf4-4328b3684b6d" path="/var/lib/kubelet/pods/9bff88ad-b577-46b9-8bf4-4328b3684b6d/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.105248 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" path="/var/lib/kubelet/pods/9e2ea248-43ad-4851-96ea-3e6adaba3ef0/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.106367 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab60b96f-60f6-436a-be55-f1d0edc65b01" path="/var/lib/kubelet/pods/ab60b96f-60f6-436a-be55-f1d0edc65b01/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.106846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b1aed7-5e53-49ac-917c-adfacd5013fb" path="/var/lib/kubelet/pods/c9b1aed7-5e53-49ac-917c-adfacd5013fb/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.107336 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a" path="/var/lib/kubelet/pods/cb7bb7bc-84d7-4d67-ba9d-fd61bbfc973a/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.111620 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d619b935-7717-4e88-af76-97e946d3cef5" path="/var/lib/kubelet/pods/d619b935-7717-4e88-af76-97e946d3cef5/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.112182 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0133422-f6fd-44ca-b8d0-d9e0696aad80" path="/var/lib/kubelet/pods/e0133422-f6fd-44ca-b8d0-d9e0696aad80/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.112755 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53cda25-6fa5-4ec3-bf8f-686c8619e97f" path="/var/lib/kubelet/pods/f53cda25-6fa5-4ec3-bf8f-686c8619e97f/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.114301 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb744c9f-38bf-4a4a-8725-c917921e58c7" path="/var/lib/kubelet/pods/fb744c9f-38bf-4a4a-8725-c917921e58c7/volumes" Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.234116 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5 is running failed: container process not found" containerID="ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.239265 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5 is running failed: container process not found" 
containerID="ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.258383 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5 is running failed: container process not found" containerID="ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.258477 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerName="nova-cell1-conductor-conductor" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.392308 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-c9grk"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.414228 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-c9grk"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.446154 4749 generic.go:334] "Generic (PLEG): container finished" podID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerID="5de820b5dca490141173e58c8807e85dfc695c9b0141c1364e49c90d3e7f477e" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.446259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerDied","Data":"5de820b5dca490141173e58c8807e85dfc695c9b0141c1364e49c90d3e7f477e"} Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.449352 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.451635 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8947-account-create-update-n8bjz"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.452015 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.478493 4749 generic.go:334] "Generic (PLEG): container finished" podID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerID="405921777b6096485df779e9382966efbbcbdbe565e378910a31ae009c151eab" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.478579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerDied","Data":"405921777b6096485df779e9382966efbbcbdbe565e378910a31ae009c151eab"} Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.478632 4749 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.478661 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.494273 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.509530 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi3ebb-account-delete-5jgft" event={"ID":"8cb80122-cbde-418d-8f3f-367087068603","Type":"ContainerStarted","Data":"6ab8ab3317b880f167426788d0636c2f6f74ab5861bf7c160fed8a33923e7357"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.510124 4749 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi3ebb-account-delete-5jgft" secret="" err="secret \"galera-openstack-dockercfg-twcjs\" not found" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.523250 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8947-account-create-update-n8bjz"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.535024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-rn8bz"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.546408 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-rn8bz"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.552247 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3ebb-account-create-update-nltbl"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.555827 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement7db1-account-delete-x9f9k" podUID="7a220429-0482-4578-9268-d4127f8da9af" containerName="mariadb-account-delete" containerID="cri-o://5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47" gracePeriod=30 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.555899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7db1-account-delete-x9f9k" event={"ID":"7a220429-0482-4578-9268-d4127f8da9af","Type":"ContainerStarted","Data":"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47"} Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.561140 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.561185 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:42.061171576 +0000 UTC m=+1545.233321433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.570590 4749 generic.go:334] "Generic (PLEG): container finished" podID="f91885d5-497a-41e4-9796-ca25f184b178" containerID="42f0d632bd584d2a9e5b8e0e0d9b68c2c2cb9b1edeb4d0f8370b7e18d29880ce" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.570651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerDied","Data":"42f0d632bd584d2a9e5b8e0e0d9b68c2c2cb9b1edeb4d0f8370b7e18d29880ce"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.572290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5123-account-delete-qmfnv" event={"ID":"d5134b41-98a9-4555-9adb-c988862f59e6","Type":"ContainerStarted","Data":"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.572763 4749 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron5123-account-delete-qmfnv" secret="" err="secret \"galera-openstack-dockercfg-twcjs\" not found" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.573478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell08947-account-delete-x2bgc" event={"ID":"cd7300ea-27b0-438e-981c-7b862054e630","Type":"ContainerStarted","Data":"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.573897 4749 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell08947-account-delete-x2bgc" secret="" err="secret \"galera-openstack-dockercfg-twcjs\" not found" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.575052 4749 generic.go:334] "Generic (PLEG): container finished" podID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerID="ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.575086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1bc6d8b8-4291-4f54-8bb2-508933b39c5a","Type":"ContainerDied","Data":"ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.596338 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerID="e2aab48732b202c3e72e85886215cd2e441185bbc74a2b999dc3bff8161959f4" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.596393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerDied","Data":"e2aab48732b202c3e72e85886215cd2e441185bbc74a2b999dc3bff8161959f4"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.596415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78bc78f9d8-g85sc" event={"ID":"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee","Type":"ContainerDied","Data":"f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.596425 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02706978598899783ee8425f468bbc0b7a506676a6f50d0825f628c8152bd66" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.600381 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.610169 4749 generic.go:334] "Generic (PLEG): container finished" podID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerID="60d404c089766276e2b776b298e5f3050db4dd04153458e59685a427fe248694" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.610249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerDied","Data":"60d404c089766276e2b776b298e5f3050db4dd04153458e59685a427fe248694"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.610270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8","Type":"ContainerDied","Data":"416ec9c7231deeac6705ee0bfb2bb8c5d8b38be6094685561cfe5fe231cdac45"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.610282 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="416ec9c7231deeac6705ee0bfb2bb8c5d8b38be6094685561cfe5fe231cdac45" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.614394 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3ebb-account-create-update-nltbl"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.624301 4749 generic.go:334] "Generic (PLEG): container finished" podID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerID="81bb8b04fc234401df04769295fca821df1009b777f50ebd985ac7c880a1e11d" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.624351 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerDied","Data":"81bb8b04fc234401df04769295fca821df1009b777f50ebd985ac7c880a1e11d"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.624637 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi3ebb-account-delete-5jgft" podStartSLOduration=5.62461642 podStartE2EDuration="5.62461642s" podCreationTimestamp="2025-11-29 01:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:36:41.538976436 +0000 UTC m=+1544.711126303" watchObservedRunningTime="2025-11-29 01:36:41.62461642 +0000 UTC m=+1544.796766277" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.629397 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement7db1-account-delete-x9f9k" podStartSLOduration=6.629387908 podStartE2EDuration="6.629387908s" podCreationTimestamp="2025-11-29 01:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:36:41.573897332 +0000 UTC m=+1544.746047189" watchObservedRunningTime="2025-11-29 01:36:41.629387908 +0000 UTC m=+1544.801537765" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.633850 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell08947-account-delete-x2bgc" podStartSLOduration=5.633824508 podStartE2EDuration="5.633824508s" podCreationTimestamp="2025-11-29 01:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:36:41.587502939 +0000 UTC m=+1544.759652796" watchObservedRunningTime="2025-11-29 01:36:41.633824508 +0000 UTC m=+1544.805974365" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.635765 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerID="a92e13de7e8f2651d46dd244243eee9c0810da21b48c9f85f4c7ffacbccd381e" exitCode=2 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.635817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf9cf96c-6bdd-425d-8983-4bfa2250edda","Type":"ContainerDied","Data":"a92e13de7e8f2651d46dd244243eee9c0810da21b48c9f85f4c7ffacbccd381e"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.635836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bf9cf96c-6bdd-425d-8983-4bfa2250edda","Type":"ContainerDied","Data":"b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.635848 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92d1c424ab81556d1e4b3ee820ddd216d8a65d6492810aa143aec4ff410ee2a" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.673421 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron5123-account-delete-qmfnv" podStartSLOduration=6.67339869 podStartE2EDuration="6.67339869s" podCreationTimestamp="2025-11-29 01:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 01:36:41.609235118 +0000 UTC m=+1544.781384975" 
watchObservedRunningTime="2025-11-29 01:36:41.67339869 +0000 UTC m=+1544.845548547" Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.685325 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.687870 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:42.187851928 +0000 UTC m=+1545.360001785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.701266 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.701364 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:42.201343123 +0000 UTC m=+1545.373492980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.727428 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.737892 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.737936 4749 generic.go:334] "Generic (PLEG): container finished" podID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerID="b6161f4f801200f63a69f8699a09c9b84f3a1ded5a4c508dc41de08731035fec" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.737973 4749 generic.go:334] "Generic (PLEG): container finished" podID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerID="9eed71d2ea4b0dda3706b093b511a6005492203beb6c1a0d7d09b12729581237" exitCode=2 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.737982 4749 generic.go:334] "Generic (PLEG): container finished" podID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerID="07a474eb6ac2724e1b0cdecbb5da046a4117a4bce874254d62a90db087962686" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.738083 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerDied","Data":"b6161f4f801200f63a69f8699a09c9b84f3a1ded5a4c508dc41de08731035fec"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.738111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerDied","Data":"9eed71d2ea4b0dda3706b093b511a6005492203beb6c1a0d7d09b12729581237"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.738135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerDied","Data":"07a474eb6ac2724e1b0cdecbb5da046a4117a4bce874254d62a90db087962686"} Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.742592 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 01:36:41 crc kubenswrapper[4749]: E1129 01:36:41.742652 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerName="nova-cell0-conductor-conductor" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.747895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f","Type":"ContainerDied","Data":"906f7a1c77acb0c1c05f8358dd94558118893ab2ab4fa0d59651e9cfe0745337"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.747933 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="906f7a1c77acb0c1c05f8358dd94558118893ab2ab4fa0d59651e9cfe0745337" Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.771627 4749 generic.go:334] "Generic (PLEG): container finished" podID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerID="0957fdee810eb3eb8c31f26782750e24fed0ee07cdb3eb0ee23af55d2e009a5c" exitCode=0 Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.771713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerDied","Data":"0957fdee810eb3eb8c31f26782750e24fed0ee07cdb3eb0ee23af55d2e009a5c"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.771743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8715ecba-f08b-4fcd-b129-d9e9c568e087","Type":"ContainerDied","Data":"4967f27014785bf65a50fb6d24a7f2f85534c68587ab6453c12a25785023d064"} Nov 29 01:36:41 crc kubenswrapper[4749]: I1129 01:36:41.771754 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4967f27014785bf65a50fb6d24a7f2f85534c68587ab6453c12a25785023d064" Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.024244 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910 is running failed: container process not found" containerID="7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.024806 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910 is running failed: container process not found" containerID="7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.025568 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910 is running failed: container process not found" containerID="7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.025639 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerName="nova-scheduler-scheduler" Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.099081 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.099168 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:43.099140919 +0000 UTC m=+1546.271290776 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.165028 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.177574 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.201430 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.202069 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:43.201958029 +0000 UTC m=+1546.374107896 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.225128 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.225478 4749 scope.go:117] "RemoveContainer" containerID="b63a6f1d3dc68fbb3a1047eaf37bb7a13d0fbf9d362ca277af916b69381c4f06" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.225883 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.230626 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.241260 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.242909 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.254238 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.254399 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.254728 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.267004 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.267410 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1089c-account-delete-d5r26"] Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.273127 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1089c-account-delete-d5r26"] Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.275249 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.281331 4749 scope.go:117] "RemoveContainer" containerID="3456069d4e6f0aebd8654c15b636090a841895914222fd2b0f04601cdc235e31" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.290313 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302524 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302562 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") " Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") " 
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.302743 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.305394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.305577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4x7q\" (UniqueName: \"kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.305745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306112 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") pod \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k25qb\" (UniqueName: \"kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306802 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.306851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.310955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.311402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.311485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.311558 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.311765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8j7m\" (UniqueName: \"kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.311942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312531 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.312551 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppsgf\" (UniqueName: \"kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5vdc\" (UniqueName: \"kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc\") pod \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\" (UID: \"dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngssb\" (UniqueName: \"kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb\") pod \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs\") pod \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts\") pod \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\" (UID: \"3b929be5-bc3e-47c6-8ac4-0ada6d740a7f\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config\") pod \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run\") pod \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\" (UID: \"ff73098b-8f03-43ac-9e1d-3ac7edd2589d\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.313645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run\") pod \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\" (UID: \"6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.314417 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.314501 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 29 01:36:42 crc kubenswrapper[4749]: E1129 01:36:42.314569 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:43.314554552 +0000 UTC m=+1546.486704409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.315233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs" (OuterVolumeSpecName: "logs") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.318081 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs" (OuterVolumeSpecName: "logs") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.318756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs" (OuterVolumeSpecName: "logs") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.329379 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.329693 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.336832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs" (OuterVolumeSpecName: "logs") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.340374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts" (OuterVolumeSpecName: "scripts") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.343170 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.348730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts" (OuterVolumeSpecName: "scripts") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.352445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb" (OuterVolumeSpecName: "kube-api-access-ngssb") pod "bf9cf96c-6bdd-425d-8983-4bfa2250edda" (UID: "bf9cf96c-6bdd-425d-8983-4bfa2250edda"). InnerVolumeSpecName "kube-api-access-ngssb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.353246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.362128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.397564 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts" (OuterVolumeSpecName: "scripts") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.397664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.399220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q" (OuterVolumeSpecName: "kube-api-access-g4x7q") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "kube-api-access-g4x7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.399302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb" (OuterVolumeSpecName: "kube-api-access-k25qb") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "kube-api-access-k25qb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.401853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs" (OuterVolumeSpecName: "logs") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.401895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs" (OuterVolumeSpecName: "logs") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.401974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.402032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts" (OuterVolumeSpecName: "scripts") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.402124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf" (OuterVolumeSpecName: "kube-api-access-ppsgf") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "kube-api-access-ppsgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.402179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc" (OuterVolumeSpecName: "kube-api-access-w5vdc") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "kube-api-access-w5vdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.414922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6" (OuterVolumeSpecName: "kube-api-access-w72h6") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "kube-api-access-w72h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.415372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts" (OuterVolumeSpecName: "scripts") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.415518 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p" (OuterVolumeSpecName: "kube-api-access-z895p") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "kube-api-access-z895p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.415580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.427072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.427923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzhv\" (UniqueName: \"kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv\") pod \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429360 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs\") pod \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429422 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gthg\" (UniqueName: \"kubernetes.io/projected/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-kube-api-access-9gthg\") pod \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom\") pod \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data\") pod \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle\") pod \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\" (UID: \"1bc6d8b8-4291-4f54-8bb2-508933b39c5a\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") pod \"e422f911-d2a1-48ac-9ad7-9394647ad23c\" (UID: \"e422f911-d2a1-48ac-9ad7-9394647ad23c\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429847 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwbjr\" (UniqueName: \"kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.429882 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") pod \"8715ecba-f08b-4fcd-b129-d9e9c568e087\" (UID: \"8715ecba-f08b-4fcd-b129-d9e9c568e087\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") pod \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\" (UID: \"70f9e6e2-7ed0-4977-a558-27fb6a9d4001\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431452 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431476 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data\") pod \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: W1129 01:36:42.431720 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e422f911-d2a1-48ac-9ad7-9394647ad23c/volumes/kubernetes.io~projected/kube-api-access-w72h6
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs\") pod \"f91885d5-497a-41e4-9796-ca25f184b178\" (UID: \"f91885d5-497a-41e4-9796-ca25f184b178\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6" (OuterVolumeSpecName: "kube-api-access-w72h6") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "kube-api-access-w72h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.431807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") pod \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\" (UID: \"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432493 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8715ecba-f08b-4fcd-b129-d9e9c568e087-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432512 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432525 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432538 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432548 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8715ecba-f08b-4fcd-b129-d9e9c568e087-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432567 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432576 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432585 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432593 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e422f911-d2a1-48ac-9ad7-9394647ad23c-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432604 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4x7q\" (UniqueName: \"kubernetes.io/projected/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-kube-api-access-g4x7q\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432614 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432622 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432630 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k25qb\" (UniqueName: \"kubernetes.io/projected/8715ecba-f08b-4fcd-b129-d9e9c568e087-kube-api-access-k25qb\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432641 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432649 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs" (OuterVolumeSpecName: "logs") pod "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" (UID: "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432659 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432702 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46xk\" (UniqueName: \"kubernetes.io/projected/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a-kube-api-access-v46xk\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432714 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432725 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72h6\" (UniqueName: \"kubernetes.io/projected/e422f911-d2a1-48ac-9ad7-9394647ad23c-kube-api-access-w72h6\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432739 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432760 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432772 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432782 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppsgf\" (UniqueName: \"kubernetes.io/projected/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-kube-api-access-ppsgf\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5vdc\" (UniqueName: \"kubernetes.io/projected/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-kube-api-access-w5vdc\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.432803 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngssb\" (UniqueName: \"kubernetes.io/projected/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-api-access-ngssb\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: W1129 01:36:42.433456 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8715ecba-f08b-4fcd-b129-d9e9c568e087/volumes/kubernetes.io~secret/scripts
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.433531 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts" (OuterVolumeSpecName: "scripts") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: W1129 01:36:42.433555 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/70f9e6e2-7ed0-4977-a558-27fb6a9d4001/volumes/kubernetes.io~projected/kube-api-access-z895p
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.433698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p" (OuterVolumeSpecName: "kube-api-access-z895p") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "kube-api-access-z895p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: W1129 01:36:42.433707 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8715ecba-f08b-4fcd-b129-d9e9c568e087/volumes/kubernetes.io~secret/config-data-custom
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.433807 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.434163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs" (OuterVolumeSpecName: "logs") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.449857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m" (OuterVolumeSpecName: "kube-api-access-c8j7m") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "kube-api-access-c8j7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.455570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv" (OuterVolumeSpecName: "kube-api-access-5vzhv") pod "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" (UID: "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba"). InnerVolumeSpecName "kube-api-access-5vzhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.455591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" (UID: "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.482422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr" (OuterVolumeSpecName: "kube-api-access-pwbjr") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "kube-api-access-pwbjr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.488330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-kube-api-access-9gthg" (OuterVolumeSpecName: "kube-api-access-9gthg") pod "1bc6d8b8-4291-4f54-8bb2-508933b39c5a" (UID: "1bc6d8b8-4291-4f54-8bb2-508933b39c5a"). InnerVolumeSpecName "kube-api-access-9gthg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537816 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f91885d5-497a-41e4-9796-ca25f184b178-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537846 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzhv\" (UniqueName: \"kubernetes.io/projected/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-kube-api-access-5vzhv\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537856 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-logs\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537864 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gthg\" (UniqueName: \"kubernetes.io/projected/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-kube-api-access-9gthg\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537874 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537882 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8j7m\" (UniqueName: \"kubernetes.io/projected/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-kube-api-access-c8j7m\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537891 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537899 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwbjr\" (UniqueName: \"kubernetes.io/projected/f91885d5-497a-41e4-9796-ca25f184b178-kube-api-access-pwbjr\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537908 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.537916 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z895p\" (UniqueName: \"kubernetes.io/projected/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-kube-api-access-z895p\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.541185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.640882 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.651595 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.656115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data" (OuterVolumeSpecName: "config-data") pod "1bc6d8b8-4291-4f54-8bb2-508933b39c5a" (UID: "1bc6d8b8-4291-4f54-8bb2-508933b39c5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.689284 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.706873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.724915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data" (OuterVolumeSpecName: "config-data") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.740245 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.741942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf9cf96c-6bdd-425d-8983-4bfa2250edda" (UID: "bf9cf96c-6bdd-425d-8983-4bfa2250edda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") pod \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\" (UID: \"bf9cf96c-6bdd-425d-8983-4bfa2250edda\") "
Nov 29 01:36:42 crc kubenswrapper[4749]: W1129 01:36:42.743164 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bf9cf96c-6bdd-425d-8983-4bfa2250edda/volumes/kubernetes.io~secret/combined-ca-bundle
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf9cf96c-6bdd-425d-8983-4bfa2250edda" (UID: "bf9cf96c-6bdd-425d-8983-4bfa2250edda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743480 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743499 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743508 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743517 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743526 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743535 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.743542 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.795734 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "internal-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.795951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican7ea4-account-delete-hg54s" event={"ID":"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21","Type":"ContainerDied","Data":"9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.796346 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dda204213f19bbad8e14e9252182cd300eb2193f6383fc3b4520229be854bb7" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.800513 4749 generic.go:334] "Generic (PLEG): container finished" podID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerID="b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.800568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26a4b5e6-f82a-4316-a7e8-d596136086c2","Type":"ContainerDied","Data":"b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.800596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"26a4b5e6-f82a-4316-a7e8-d596136086c2","Type":"ContainerDied","Data":"f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.800606 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2218f87fab37470c221217079b5947f5bfbf515b4dc7243a51a1e40c0ceab0d" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.804079 4749 generic.go:334] "Generic (PLEG): container finished" podID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerID="5fea5dc22f78f4b44307aafccca2e0586ca8c1965943165df879437b824209c1" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.804137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerDied","Data":"5fea5dc22f78f4b44307aafccca2e0586ca8c1965943165df879437b824209c1"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.804170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3db8c8b1-f222-46a2-9a4a-e9e48f27802c","Type":"ContainerDied","Data":"4359ec511f961657d2664590717f11dbb2e8947c2d0bff9fea5d94067b10e986"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.804185 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4359ec511f961657d2664590717f11dbb2e8947c2d0bff9fea5d94067b10e986" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.806326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.808401 4749 generic.go:334] "Generic (PLEG): container finished" podID="3962a4be-25eb-45f6-8b1a-f84341319df3" containerID="d8dc0505ade6c3a7b001fd49ab3118193dd6af3d47615b2a7ee7ceba6fef3ffc" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.808472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3962a4be-25eb-45f6-8b1a-f84341319df3","Type":"ContainerDied","Data":"d8dc0505ade6c3a7b001fd49ab3118193dd6af3d47615b2a7ee7ceba6fef3ffc"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.808500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3962a4be-25eb-45f6-8b1a-f84341319df3","Type":"ContainerDied","Data":"d7a6822b2bd25f98130f4ad5d9e194fdde848d8e6de8b38846c9ae549b197ff8"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.808513 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7a6822b2bd25f98130f4ad5d9e194fdde848d8e6de8b38846c9ae549b197ff8" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.825050 4749 generic.go:334] "Generic (PLEG): container finished" podID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerID="ac6b10cbf89933157f87ce0832e498f620cc616f9e2d34fbbac8faa2f0a1cdbd" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.825151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerDied","Data":"ac6b10cbf89933157f87ce0832e498f620cc616f9e2d34fbbac8faa2f0a1cdbd"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.825186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" event={"ID":"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4","Type":"ContainerDied","Data":"9f3b91d6584f697e324f451409378a786e169b21cd99c1d50e0345989aeddf8c"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.825217 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3b91d6584f697e324f451409378a786e169b21cd99c1d50e0345989aeddf8c" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.827426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-744d76c7bb-6xh5q" event={"ID":"e422f911-d2a1-48ac-9ad7-9394647ad23c","Type":"ContainerDied","Data":"d7b0c26bec26e466ebc66b267c17b31bec562429e5b71d626ce13e2764f2a0c0"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.827469 4749 scope.go:117] "RemoveContainer" containerID="405921777b6096485df779e9382966efbbcbdbe565e378910a31ae009c151eab" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.827567 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-744d76c7bb-6xh5q" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.832353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff73098b-8f03-43ac-9e1d-3ac7edd2589d","Type":"ContainerDied","Data":"7eefed1f9a37f25e0cea60beaf548d1974bfd1e9132a54bed7b64b56c7b2c78e"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.832492 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.846717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cindere578-account-delete-x9bh8" event={"ID":"c49a0e83-3d6d-47e1-8ca4-4cae46d23168","Type":"ContainerDied","Data":"8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.846748 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b842dec2dc1cd24bc76e95b9d1000ebd3d3615daa533678174905e52c39c609" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.846907 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.846934 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.853316 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "bf9cf96c-6bdd-425d-8983-4bfa2250edda" (UID: "bf9cf96c-6bdd-425d-8983-4bfa2250edda"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.864343 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.864391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.864961 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerID="7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.865035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5792ec0-0d00-47e7-8d9d-d3133cd1e695","Type":"ContainerDied","Data":"7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.865304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5792ec0-0d00-47e7-8d9d-d3133cd1e695","Type":"ContainerDied","Data":"a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.865326 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a9c886da8ff00a85d21209001adbbe25ae6a312e2dd9fdc5c169207ccff283" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.867837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f91885d5-497a-41e4-9796-ca25f184b178","Type":"ContainerDied","Data":"f2b443d970a95528d88d248d726b79e84330d35a18bee21c818cbecd0765b0bf"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.867933 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.874252 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerID="3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e" exitCode=0 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.874289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerDied","Data":"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.874305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" event={"ID":"e3d65a6e-107e-4ffe-8561-49b1dd2e9aba","Type":"ContainerDied","Data":"06b0e720ffdf3ec5411ab49d6dd1866ab6e45dffdb43a288497d385ec8e175b5"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.874343 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56469f8b8-ckfjz" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.879942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70f9e6e2-7ed0-4977-a558-27fb6a9d4001","Type":"ContainerDied","Data":"44de507991c18908d1c9bc05646c35ae0db12206dec2019de193655acb26ec96"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.879999 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.883486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance261d-account-delete-pndlc" event={"ID":"132885d9-92b8-4cf2-b8e2-7f365fd0d020","Type":"ContainerDied","Data":"f0c806610465678d9c16dad7e18f9792372a73b28f94f676f0104cfd51a81a20"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.883509 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c806610465678d9c16dad7e18f9792372a73b28f94f676f0104cfd51a81a20" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.885652 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell08947-account-delete-x2bgc" podUID="cd7300ea-27b0-438e-981c-7b862054e630" containerName="mariadb-account-delete" containerID="cri-o://11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699" gracePeriod=30 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889285 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78bc78f9d8-g85sc" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889512 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889555 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1bc6d8b8-4291-4f54-8bb2-508933b39c5a","Type":"ContainerDied","Data":"328b54dca585a1b974545c1971fe79ba96509cf4f8ea37a2e72792b56929f041"} Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889624 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889652 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889682 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.889744 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron5123-account-delete-qmfnv" podUID="d5134b41-98a9-4555-9adb-c988862f59e6" containerName="mariadb-account-delete" containerID="cri-o://27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f" gracePeriod=30 Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.890135 4749 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapi3ebb-account-delete-5jgft" secret="" err="secret \"galera-openstack-dockercfg-twcjs\" not found" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.914502 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.914604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bc6d8b8-4291-4f54-8bb2-508933b39c5a" (UID: "1bc6d8b8-4291-4f54-8bb2-508933b39c5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.929374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data" (OuterVolumeSpecName: "config-data") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.930035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.930329 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70f9e6e2-7ed0-4977-a558-27fb6a9d4001" (UID: "70f9e6e2-7ed0-4977-a558-27fb6a9d4001"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.936880 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.937707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.947391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data" (OuterVolumeSpecName: "config-data") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950023 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950051 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950063 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950077 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950089 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950102 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc6d8b8-4291-4f54-8bb2-508933b39c5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950113 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950125 4749 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950136 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950148 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70f9e6e2-7ed0-4977-a558-27fb6a9d4001-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.950158 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.953760 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data" (OuterVolumeSpecName: "config-data") pod "e422f911-d2a1-48ac-9ad7-9394647ad23c" (UID: "e422f911-d2a1-48ac-9ad7-9394647ad23c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.965377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" (UID: "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.965564 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.976155 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:42 crc kubenswrapper[4749]: I1129 01:36:42.976579 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:42.992459 4749 scope.go:117] "RemoveContainer" containerID="02e31a093f3745f70d2f114e3071dcc2ff626d6fa46cc35da2f370cfb31b8cce" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:42.998427 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cindere578-account-delete-x9bh8" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.004467 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.021220 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data" (OuterVolumeSpecName: "config-data") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.042433 4749 scope.go:117] "RemoveContainer" containerID="81bb8b04fc234401df04769295fca821df1009b777f50ebd985ac7c880a1e11d" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.049702 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.050997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data\") pod \"3962a4be-25eb-45f6-8b1a-f84341319df3\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs\") pod \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts\") pod \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data\") pod \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8gjw\" (UniqueName: \"kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw\") pod \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config\") pod \"3962a4be-25eb-45f6-8b1a-f84341319df3\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54xx2\" (UniqueName: \"kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2\") pod \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\" (UID: \"c49a0e83-3d6d-47e1-8ca4-4cae46d23168\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle\") pod \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051360 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs\") pod \"3962a4be-25eb-45f6-8b1a-f84341319df3\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts\") pod \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\" 
(UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom\") pod \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\" (UID: \"d74c0b83-feac-4f92-b5cf-1ec7c9e298a4\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051459 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89lml\" (UniqueName: \"kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml\") pod \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\" (UID: \"132885d9-92b8-4cf2-b8e2-7f365fd0d020\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle\") pod \"3962a4be-25eb-45f6-8b1a-f84341319df3\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhml\" (UniqueName: \"kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml\") pod \"3962a4be-25eb-45f6-8b1a-f84341319df3\" (UID: \"3962a4be-25eb-45f6-8b1a-f84341319df3\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051529 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data" (OuterVolumeSpecName: "config-data") pod "3962a4be-25eb-45f6-8b1a-f84341319df3" (UID: "3962a4be-25eb-45f6-8b1a-f84341319df3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051809 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051821 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e422f911-d2a1-48ac-9ad7-9394647ad23c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051830 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051838 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.051846 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.052492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "132885d9-92b8-4cf2-b8e2-7f365fd0d020" (UID: "132885d9-92b8-4cf2-b8e2-7f365fd0d020"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.055064 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs" (OuterVolumeSpecName: "logs") pod "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" (UID: "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.055460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c49a0e83-3d6d-47e1-8ca4-4cae46d23168" (UID: "c49a0e83-3d6d-47e1-8ca4-4cae46d23168"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.058830 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml" (OuterVolumeSpecName: "kube-api-access-89lml") pod "132885d9-92b8-4cf2-b8e2-7f365fd0d020" (UID: "132885d9-92b8-4cf2-b8e2-7f365fd0d020"). InnerVolumeSpecName "kube-api-access-89lml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.062617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" (UID: "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.062664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data" (OuterVolumeSpecName: "config-data") pod "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" (UID: "e3d65a6e-107e-4ffe-8561-49b1dd2e9aba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.062689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" (UID: "6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.062838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw" (OuterVolumeSpecName: "kube-api-access-d8gjw") pod "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" (UID: "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4"). InnerVolumeSpecName "kube-api-access-d8gjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.067361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml" (OuterVolumeSpecName: "kube-api-access-kjhml") pod "3962a4be-25eb-45f6-8b1a-f84341319df3" (UID: "3962a4be-25eb-45f6-8b1a-f84341319df3"). InnerVolumeSpecName "kube-api-access-kjhml". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.067419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2" (OuterVolumeSpecName: "kube-api-access-54xx2") pod "c49a0e83-3d6d-47e1-8ca4-4cae46d23168" (UID: "c49a0e83-3d6d-47e1-8ca4-4cae46d23168"). InnerVolumeSpecName "kube-api-access-54xx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.072007 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3962a4be-25eb-45f6-8b1a-f84341319df3" (UID: "3962a4be-25eb-45f6-8b1a-f84341319df3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.073370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.103802 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican7ea4-account-delete-hg54s" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.111730 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.140256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.142704 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.144072 4749 scope.go:117] "RemoveContainer" containerID="f72654b1f6476b877d4a3b3f9bd36b7ac0424fa98f947f3608445049745b170c" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.145781 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d" path="/var/lib/kubelet/pods/0fa47a44-8bb1-4569-8c3e-fcc4d6d9160d/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.148486 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39408fef-9da8-4fde-b501-421748576739" path="/var/lib/kubelet/pods/39408fef-9da8-4fde-b501-421748576739/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.150274 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.150428 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" path="/var/lib/kubelet/pods/5906d408-1c10-4c55-a07b-f94d302a08c6/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.151993 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60229df5-f068-4b05-ba77-f02234f912f7" path="/var/lib/kubelet/pods/60229df5-f068-4b05-ba77-f02234f912f7/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.153091 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4218c5-c97a-4e5d-8d70-28bc3d763d8a" path="/var/lib/kubelet/pods/eb4218c5-c97a-4e5d-8d70-28bc3d763d8a/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.154566 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edfd4dba-036b-469a-bd29-185017dbfa55" path="/var/lib/kubelet/pods/edfd4dba-036b-469a-bd29-185017dbfa55/volumes" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc 
kubenswrapper[4749]: I1129 01:36:43.155423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts\") pod \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle\") pod \"26a4b5e6-f82a-4316-a7e8-d596136086c2\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data\") pod \"26a4b5e6-f82a-4316-a7e8-d596136086c2\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxcn4\" (UniqueName: \"kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdzzn\" (UniqueName: \"kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn\") pod \"26a4b5e6-f82a-4316-a7e8-d596136086c2\" (UID: \"26a4b5e6-f82a-4316-a7e8-d596136086c2\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.155872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx74p\" (UniqueName: \"kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p\") pod \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\" (UID: \"1a342ef2-ab29-4277-a7b4-0f83e5c3ca21\") " Nov 29 01:36:43 crc 
kubenswrapper[4749]: I1129 01:36:43.155913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd\") pod \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\" (UID: \"3db8c8b1-f222-46a2-9a4a-e9e48f27802c\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.159048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160043 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160540 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160584 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/132885d9-92b8-4cf2-b8e2-7f365fd0d020-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160602 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160616 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160629 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89lml\" (UniqueName: \"kubernetes.io/projected/132885d9-92b8-4cf2-b8e2-7f365fd0d020-kube-api-access-89lml\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160651 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhml\" (UniqueName: \"kubernetes.io/projected/3962a4be-25eb-45f6-8b1a-f84341319df3-kube-api-access-kjhml\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160663 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160672 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160685 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-logs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.160696 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.169505 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.169587 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.169647 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:45.169627139 +0000 UTC m=+1548.341776996 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.171085 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8gjw\" (UniqueName: \"kubernetes.io/projected/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-kube-api-access-d8gjw\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.171463 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p" (OuterVolumeSpecName: "kube-api-access-xx74p") pod "1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" (UID: "1a342ef2-ab29-4277-a7b4-0f83e5c3ca21"). InnerVolumeSpecName "kube-api-access-xx74p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.172219 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3962a4be-25eb-45f6-8b1a-f84341319df3-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.172235 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.172252 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.172263 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54xx2\" (UniqueName: \"kubernetes.io/projected/c49a0e83-3d6d-47e1-8ca4-4cae46d23168-kube-api-access-54xx2\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.182274 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data" (OuterVolumeSpecName: "config-data") pod "ff73098b-8f03-43ac-9e1d-3ac7edd2589d" (UID: "ff73098b-8f03-43ac-9e1d-3ac7edd2589d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.182646 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" (UID: "1a342ef2-ab29-4277-a7b4-0f83e5c3ca21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.183136 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4" (OuterVolumeSpecName: "kube-api-access-pxcn4") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "kube-api-access-pxcn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.188702 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f91885d5-497a-41e4-9796-ca25f184b178" (UID: "f91885d5-497a-41e4-9796-ca25f184b178"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.190871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts" (OuterVolumeSpecName: "scripts") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.190989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn" (OuterVolumeSpecName: "kube-api-access-vdzzn") pod "26a4b5e6-f82a-4316-a7e8-d596136086c2" (UID: "26a4b5e6-f82a-4316-a7e8-d596136086c2"). InnerVolumeSpecName "kube-api-access-vdzzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.191130 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data" (OuterVolumeSpecName: "config-data") pod "8715ecba-f08b-4fcd-b129-d9e9c568e087" (UID: "8715ecba-f08b-4fcd-b129-d9e9c568e087"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.194998 4749 scope.go:117] "RemoveContainer" containerID="42f0d632bd584d2a9e5b8e0e0d9b68c2c2cb9b1edeb4d0f8370b7e18d29880ce" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.222874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "bf9cf96c-6bdd-425d-8983-4bfa2250edda" (UID: "bf9cf96c-6bdd-425d-8983-4bfa2250edda"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.222955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" (UID: "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.232667 4749 scope.go:117] "RemoveContainer" containerID="2156bc219030d070a0c4da6cd4ca235e2905d2b767d2af6d322ac73d528b20d1" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.235536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data" (OuterVolumeSpecName: "config-data") pod "26a4b5e6-f82a-4316-a7e8-d596136086c2" (UID: "26a4b5e6-f82a-4316-a7e8-d596136086c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.249421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3962a4be-25eb-45f6-8b1a-f84341319df3" (UID: "3962a4be-25eb-45f6-8b1a-f84341319df3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.252879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26a4b5e6-f82a-4316-a7e8-d596136086c2" (UID: "26a4b5e6-f82a-4316-a7e8-d596136086c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.257897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data" (OuterVolumeSpecName: "config-data") pod "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" (UID: "3b929be5-bc3e-47c6-8ac4-0ada6d740a7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.261906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data" (OuterVolumeSpecName: "config-data") pod "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" (UID: "d74c0b83-feac-4f92-b5cf-1ec7c9e298a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqlvg\" (UniqueName: \"kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg\") pod \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle\") pod \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data\") pod \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\" (UID: \"f5792ec0-0d00-47e7-8d9d-d3133cd1e695\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.277604 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.277650 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:45.277634718 +0000 UTC m=+1548.449784575 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277748 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277759 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277768 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff73098b-8f03-43ac-9e1d-3ac7edd2589d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277776 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277784 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277793 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277803 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277811 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8715ecba-f08b-4fcd-b129-d9e9c568e087-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277818 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277827 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26a4b5e6-f82a-4316-a7e8-d596136086c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277835 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxcn4\" (UniqueName: \"kubernetes.io/projected/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-kube-api-access-pxcn4\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277844 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f91885d5-497a-41e4-9796-ca25f184b178-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277852 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf9cf96c-6bdd-425d-8983-4bfa2250edda-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277861 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277869 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdzzn\" (UniqueName: \"kubernetes.io/projected/26a4b5e6-f82a-4316-a7e8-d596136086c2-kube-api-access-vdzzn\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.277877 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx74p\" (UniqueName: \"kubernetes.io/projected/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21-kube-api-access-xx74p\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.279528 4749 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.279568 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data podName:9a9603fe-72d8-479a-86be-9b914455fba1 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:51.279556286 +0000 UTC m=+1554.451706223 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data") pod "rabbitmq-cell1-server-0" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1") : configmap "rabbitmq-cell1-config-data" not found Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.283813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg" (OuterVolumeSpecName: "kube-api-access-kqlvg") pod "f5792ec0-0d00-47e7-8d9d-d3133cd1e695" (UID: "f5792ec0-0d00-47e7-8d9d-d3133cd1e695"). InnerVolumeSpecName "kube-api-access-kqlvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.286076 4749 scope.go:117] "RemoveContainer" containerID="3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.289489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" (UID: "dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.296027 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.323628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "3962a4be-25eb-45f6-8b1a-f84341319df3" (UID: "3962a4be-25eb-45f6-8b1a-f84341319df3"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.343664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5792ec0-0d00-47e7-8d9d-d3133cd1e695" (UID: "f5792ec0-0d00-47e7-8d9d-d3133cd1e695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.343707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data" (OuterVolumeSpecName: "config-data") pod "f5792ec0-0d00-47e7-8d9d-d3133cd1e695" (UID: "f5792ec0-0d00-47e7-8d9d-d3133cd1e695"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.345364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.362133 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.362288 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-744d76c7bb-6xh5q"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379543 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqlvg\" (UniqueName: \"kubernetes.io/projected/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-kube-api-access-kqlvg\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379576 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379585 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379593 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379602 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5792ec0-0d00-47e7-8d9d-d3133cd1e695-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc 
kubenswrapper[4749]: I1129 01:36:43.379612 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.379623 4749 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3962a4be-25eb-45f6-8b1a-f84341319df3-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.379684 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.379730 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:45.37971529 +0000 UTC m=+1548.551865147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.382312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data" (OuterVolumeSpecName: "config-data") pod "3db8c8b1-f222-46a2-9a4a-e9e48f27802c" (UID: "3db8c8b1-f222-46a2-9a4a-e9e48f27802c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.398341 4749 scope.go:117] "RemoveContainer" containerID="d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.481818 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db8c8b1-f222-46a2-9a4a-e9e48f27802c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.707799 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.711172 4749 scope.go:117] "RemoveContainer" containerID="3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.711687 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e\": container with ID starting with 3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e not found: ID does not exist" containerID="3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.711722 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e"} err="failed to get container status \"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e\": rpc error: code = NotFound desc = could not find container \"3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e\": container with ID starting 
with 3243691eca9bb1aac1e774e7e062561a8426ee6041bd677c0516360cc48f209e not found: ID does not exist" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.711748 4749 scope.go:117] "RemoveContainer" containerID="d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09" Nov 29 01:36:43 crc kubenswrapper[4749]: E1129 01:36:43.712485 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09\": container with ID starting with d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09 not found: ID does not exist" containerID="d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.712518 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09"} err="failed to get container status \"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09\": rpc error: code = NotFound desc = could not find container \"d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09\": container with ID starting with d07bb0bc1e0dac0c6a38de75377b5128ff12b0f85d1fa99a5c4bebe90e314e09 not found: ID does not exist" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.712539 4749 scope.go:117] "RemoveContainer" containerID="5de820b5dca490141173e58c8807e85dfc695c9b0141c1364e49c90d3e7f477e" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.712756 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.728361 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-56469f8b8-ckfjz"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.750049 4749 scope.go:117] "RemoveContainer" containerID="8aabc98b71671061ad9285c52d012f1555e22985b6de832773cf911ed650a0eb" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.775221 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bl48\" (UniqueName: \"kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48\") pod 
\"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787626 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.787650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs\") pod \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\" (UID: \"f05059ec-0cc5-4873-8041-bb14c2fa4c53\") " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.789981 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.790572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.792499 4749 scope.go:117] "RemoveContainer" containerID="ea660191ca01502179c1441bac05dbf383e8e263212f8f6a8e09dc6e221432a5" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.792953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.798988 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.799998 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.820500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48" (OuterVolumeSpecName: "kube-api-access-7bl48") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "kube-api-access-7bl48". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.820828 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.829188 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.832764 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.845448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.846691 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.848695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "f05059ec-0cc5-4873-8041-bb14c2fa4c53" (UID: "f05059ec-0cc5-4873-8041-bb14c2fa4c53"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.855609 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.865682 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.877155 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.887008 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889081 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889107 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889116 4749 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889126 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bl48\" (UniqueName: \"kubernetes.io/projected/f05059ec-0cc5-4873-8041-bb14c2fa4c53-kube-api-access-7bl48\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889135 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889143 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f05059ec-0cc5-4873-8041-bb14c2fa4c53-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889151 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.889158 4749 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f05059ec-0cc5-4873-8041-bb14c2fa4c53-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.896094 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.898421 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c639d859-841e-4f38-a2b3-09fc3201e616/ovn-northd/0.log" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.898564 4749 generic.go:334] "Generic (PLEG): container finished" podID="c639d859-841e-4f38-a2b3-09fc3201e616" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" exitCode=139 Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.898658 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerDied","Data":"32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27"} Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.901831 4749 generic.go:334] "Generic (PLEG): container finished" podID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerID="30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254" exitCode=0 Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.902069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerDied","Data":"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254"} Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.902175 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.902272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"f05059ec-0cc5-4873-8041-bb14c2fa4c53","Type":"ContainerDied","Data":"45788b64bbc41910f81bca78a3a12bf0287f755b49c0a7d3fdac15570d7d0c18"} Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.902397 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.912282 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78bc78f9d8-g85sc"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.919306 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.919328 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.926111 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.928681 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a9603fe-72d8-479a-86be-9b914455fba1" containerID="00208c9ed91f795bee62848698e8c46182c049d63a9adf626a2bc73dc90a56e8" exitCode=0 Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.928733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerDied","Data":"00208c9ed91f795bee62848698e8c46182c049d63a9adf626a2bc73dc90a56e8"} Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.936834 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cindere578-account-delete-x9bh8" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937004 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937092 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi3ebb-account-delete-5jgft" podUID="8cb80122-cbde-418d-8f3f-367087068603" containerName="mariadb-account-delete" containerID="cri-o://6ab8ab3317b880f167426788d0636c2f6f74ab5861bf7c160fed8a33923e7357" gracePeriod=30 Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937262 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance261d-account-delete-pndlc" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937451 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937613 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b8cf6c56c-s8qz2" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937638 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937659 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican7ea4-account-delete-hg54s" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.937680 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.962315 4749 scope.go:117] "RemoveContainer" containerID="30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254" Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.964856 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 01:36:43 crc kubenswrapper[4749]: I1129 01:36:43.998222 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.011862 4749 scope.go:117] "RemoveContainer" containerID="fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.023062 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.045852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.080448 4749 scope.go:117] "RemoveContainer" containerID="30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.083314 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.090337 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.092000 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254\": container with ID starting with 30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254 not found: ID does not exist" 
containerID="30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.092035 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254"} err="failed to get container status \"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254\": rpc error: code = NotFound desc = could not find container \"30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254\": container with ID starting with 30379bd830c8a72d48554f64bbe123f20a063207575aa16178d71d308ac52254 not found: ID does not exist" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.092060 4749 scope.go:117] "RemoveContainer" containerID="fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20" Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.098273 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20\": container with ID starting with fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20 not found: ID does not exist" containerID="fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.098303 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20"} err="failed to get container status \"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20\": rpc error: code = NotFound desc = could not find container \"fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20\": container with ID starting with fb76604ff6967007ea1ef683f9bd0162012d477142846fb514ebd36d53d46c20 not found: ID does not exist" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.100303 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.107569 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cindere578-account-delete-x9bh8"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.123055 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cindere578-account-delete-x9bh8"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.131083 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.137339 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b8cf6c56c-s8qz2"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.165252 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.168235 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.174839 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.182104 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c639d859-841e-4f38-a2b3-09fc3201e616/ovn-northd/0.log" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.182184 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.182274 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.194265 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.212240 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.213779 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.220019 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican7ea4-account-delete-hg54s"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.230234 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.234228 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.302854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.302904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg4qv\" (UniqueName: \"kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.302922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.302974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.303021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.303106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs\") pod \"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.303153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle\") pod 
\"c639d859-841e-4f38-a2b3-09fc3201e616\" (UID: \"c639d859-841e-4f38-a2b3-09fc3201e616\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.312711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.312756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config" (OuterVolumeSpecName: "config") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.313119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts" (OuterVolumeSpecName: "scripts") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.318344 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv" (OuterVolumeSpecName: "kube-api-access-sg4qv") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "kube-api-access-sg4qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.333724 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.379605 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.383242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c639d859-841e-4f38-a2b3-09fc3201e616" (UID: "c639d859-841e-4f38-a2b3-09fc3201e616"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.399673 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.404998 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405150 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg4qv\" (UniqueName: \"kubernetes.io/projected/c639d859-841e-4f38-a2b3-09fc3201e616-kube-api-access-sg4qv\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405223 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405280 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c639d859-841e-4f38-a2b3-09fc3201e616-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405335 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c639d859-841e-4f38-a2b3-09fc3201e616-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405386 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.405446 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c639d859-841e-4f38-a2b3-09fc3201e616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.505986 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506051 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506135 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506241 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzld\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info\") pod \"9a9603fe-72d8-479a-86be-9b914455fba1\" (UID: \"9a9603fe-72d8-479a-86be-9b914455fba1\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506819 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.506957 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.507646 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.509828 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info" (OuterVolumeSpecName: "pod-info") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.509852 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld" (OuterVolumeSpecName: "kube-api-access-lqzld") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "kube-api-access-lqzld". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.512948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.513232 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.516292 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.533788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data" (OuterVolumeSpecName: "config-data") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.556904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf" (OuterVolumeSpecName: "server-conf") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.600188 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.600330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9a9603fe-72d8-479a-86be-9b914455fba1" (UID: "9a9603fe-72d8-479a-86be-9b914455fba1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608484 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608513 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608522 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608531 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608540 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608549 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzld\" (UniqueName: \"kubernetes.io/projected/9a9603fe-72d8-479a-86be-9b914455fba1-kube-api-access-lqzld\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608556 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9a9603fe-72d8-479a-86be-9b914455fba1-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608564 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9a9603fe-72d8-479a-86be-9b914455fba1-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608588 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608596 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9a9603fe-72d8-479a-86be-9b914455fba1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.608605 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/9a9603fe-72d8-479a-86be-9b914455fba1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.635316 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.640865 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdpzl\" (UniqueName: \"kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709429 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709475 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709575 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4sk4\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709739 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709759 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts\") pod \"28df216a-4f1e-449f-aaf6-45fd12929ad8\" (UID: \"28df216a-4f1e-449f-aaf6-45fd12929ad8\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.709842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd\") pod \"31a44203-fd94-4eb4-952f-d54a5c577095\" (UID: \"31a44203-fd94-4eb4-952f-d54a5c577095\") " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.710096 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.711705 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.711750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.713108 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.713407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl" (OuterVolumeSpecName: "kube-api-access-mdpzl") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "kube-api-access-mdpzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.714494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.716384 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.719252 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.722772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info" (OuterVolumeSpecName: "pod-info") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.727918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts" (OuterVolumeSpecName: "scripts") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.727977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.728100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4" (OuterVolumeSpecName: "kube-api-access-s4sk4") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "kube-api-access-s4sk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.732263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.747937 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.751664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data" (OuterVolumeSpecName: "config-data") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.760690 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data" (OuterVolumeSpecName: "config-data") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.771326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf" (OuterVolumeSpecName: "server-conf") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.776577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.785308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "28df216a-4f1e-449f-aaf6-45fd12929ad8" (UID: "28df216a-4f1e-449f-aaf6-45fd12929ad8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811258 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811288 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdpzl\" (UniqueName: \"kubernetes.io/projected/28df216a-4f1e-449f-aaf6-45fd12929ad8-kube-api-access-mdpzl\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811300 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811308 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811317 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811327 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811340 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811349 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc 
kubenswrapper[4749]: I1129 01:36:44.811374 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811383 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811391 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4sk4\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-kube-api-access-s4sk4\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811399 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811406 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/31a44203-fd94-4eb4-952f-d54a5c577095-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811415 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811422 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/31a44203-fd94-4eb4-952f-d54a5c577095-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811430 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/28df216a-4f1e-449f-aaf6-45fd12929ad8-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811439 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.811446 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/31a44203-fd94-4eb4-952f-d54a5c577095-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.825492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "31a44203-fd94-4eb4-952f-d54a5c577095" (UID: "31a44203-fd94-4eb4-952f-d54a5c577095"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.834310 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.879105 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.879694 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.879919 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.879951 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.880452 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.884457 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.885984 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.886022 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.912650 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.912679 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/31a44203-fd94-4eb4-952f-d54a5c577095-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.950394 4749 generic.go:334] "Generic (PLEG): container finished" podID="28df216a-4f1e-449f-aaf6-45fd12929ad8" containerID="41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349" exitCode=0 Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.950448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5784c8bdbd-lvrsx" event={"ID":"28df216a-4f1e-449f-aaf6-45fd12929ad8","Type":"ContainerDied","Data":"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.950489 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5784c8bdbd-lvrsx" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.950516 4749 scope.go:117] "RemoveContainer" containerID="41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.950502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5784c8bdbd-lvrsx" event={"ID":"28df216a-4f1e-449f-aaf6-45fd12929ad8","Type":"ContainerDied","Data":"06b139fb24804461bcd182a769a5966891a8c20fa02074406e9d9c9291d59711"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.960355 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c639d859-841e-4f38-a2b3-09fc3201e616/ovn-northd/0.log" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.960484 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.961429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c639d859-841e-4f38-a2b3-09fc3201e616","Type":"ContainerDied","Data":"de49a7f8016c592329b857f9d4f7dfc7e1181f8e388cac615fdbefdcd54b5aab"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.977398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9a9603fe-72d8-479a-86be-9b914455fba1","Type":"ContainerDied","Data":"5767aeef5b80eb49702430521e1e47fa69ca9b4cbf2afc25a9131ede7d597978"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.977728 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.982850 4749 generic.go:334] "Generic (PLEG): container finished" podID="31a44203-fd94-4eb4-952f-d54a5c577095" containerID="eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1" exitCode=0 Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.982902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerDied","Data":"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.982934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"31a44203-fd94-4eb4-952f-d54a5c577095","Type":"ContainerDied","Data":"9f0a445b3e24931289c0920fe22beb7a96551b7c1eb02e57c99c4c323e31fa6c"} Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.983113 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.989702 4749 scope.go:117] "RemoveContainer" containerID="41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.992970 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:36:44 crc kubenswrapper[4749]: E1129 01:36:44.993389 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349\": container with ID starting with 41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349 not found: ID does not exist" containerID="41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.993446 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349"} err="failed to get container status \"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349\": rpc error: code = NotFound desc = could not find container \"41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349\": container with ID starting with 41c38c234e7ee42b7617ba5fd9b85bed7734ca7897cd58458309502c1f7cf349 not found: ID does not exist" Nov 29 01:36:44 crc kubenswrapper[4749]: I1129 01:36:44.993469 4749 scope.go:117] "RemoveContainer" containerID="54a83438d867dacb43afd491866da3b4e42fc03fc4d37a58d1393a675b29c458" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.000686 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5784c8bdbd-lvrsx"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.020625 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.027647 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.028047 4749 scope.go:117] "RemoveContainer" containerID="32762a928fa67597294813b16786801783c4b4baa964ebb4bdf0d12c207bac27" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.057760 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.065409 4749 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.077795 4749 scope.go:117] "RemoveContainer" containerID="00208c9ed91f795bee62848698e8c46182c049d63a9adf626a2bc73dc90a56e8" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.117395 4749 scope.go:117] "RemoveContainer" containerID="d15407c83f3311f8c3958f6e4bd3c9da53a4bea36613017566ab26cfa4a60437" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.118091 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" path="/var/lib/kubelet/pods/1a342ef2-ab29-4277-a7b4-0f83e5c3ca21/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.118840 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" path="/var/lib/kubelet/pods/1bc6d8b8-4291-4f54-8bb2-508933b39c5a/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.120927 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" path="/var/lib/kubelet/pods/26a4b5e6-f82a-4316-a7e8-d596136086c2/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.123618 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28df216a-4f1e-449f-aaf6-45fd12929ad8" path="/var/lib/kubelet/pods/28df216a-4f1e-449f-aaf6-45fd12929ad8/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.124631 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" path="/var/lib/kubelet/pods/31a44203-fd94-4eb4-952f-d54a5c577095/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.125503 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3962a4be-25eb-45f6-8b1a-f84341319df3" path="/var/lib/kubelet/pods/3962a4be-25eb-45f6-8b1a-f84341319df3/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.127110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" path="/var/lib/kubelet/pods/3b929be5-bc3e-47c6-8ac4-0ada6d740a7f/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.128052 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" path="/var/lib/kubelet/pods/3db8c8b1-f222-46a2-9a4a-e9e48f27802c/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.129693 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" path="/var/lib/kubelet/pods/6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.130899 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" path="/var/lib/kubelet/pods/70f9e6e2-7ed0-4977-a558-27fb6a9d4001/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.131568 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" path="/var/lib/kubelet/pods/8715ecba-f08b-4fcd-b129-d9e9c568e087/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.132672 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" path="/var/lib/kubelet/pods/bf9cf96c-6bdd-425d-8983-4bfa2250edda/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.133503 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="c49a0e83-3d6d-47e1-8ca4-4cae46d23168" path="/var/lib/kubelet/pods/c49a0e83-3d6d-47e1-8ca4-4cae46d23168/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.134162 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" path="/var/lib/kubelet/pods/c639d859-841e-4f38-a2b3-09fc3201e616/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.135187 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" path="/var/lib/kubelet/pods/d74c0b83-feac-4f92-b5cf-1ec7c9e298a4/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.135843 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" path="/var/lib/kubelet/pods/dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.136540 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" path="/var/lib/kubelet/pods/e3d65a6e-107e-4ffe-8561-49b1dd2e9aba/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.137634 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" path="/var/lib/kubelet/pods/e422f911-d2a1-48ac-9ad7-9394647ad23c/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.138326 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" path="/var/lib/kubelet/pods/f05059ec-0cc5-4873-8041-bb14c2fa4c53/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.139097 4749 scope.go:117] "RemoveContainer" containerID="eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.139451 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" path="/var/lib/kubelet/pods/f5792ec0-0d00-47e7-8d9d-d3133cd1e695/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.140470 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91885d5-497a-41e4-9796-ca25f184b178" path="/var/lib/kubelet/pods/f91885d5-497a-41e4-9796-ca25f184b178/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.141274 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" path="/var/lib/kubelet/pods/ff73098b-8f03-43ac-9e1d-3ac7edd2589d/volumes" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142522 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jm2kl"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142553 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jm2kl"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142568 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142581 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142596 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-261d-account-create-update-j6mzs"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.142607 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-261d-account-create-update-j6mzs"] Nov 29 01:36:45 
crc kubenswrapper[4749]: I1129 01:36:45.148351 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance261d-account-delete-pndlc"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.155057 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance261d-account-delete-pndlc"] Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.158369 4749 scope.go:117] "RemoveContainer" containerID="b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.176677 4749 scope.go:117] "RemoveContainer" containerID="eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1" Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.177093 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1\": container with ID starting with eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1 not found: ID does not exist" containerID="eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.177148 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1"} err="failed to get container status \"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1\": rpc error: code = NotFound desc = could not find container \"eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1\": container with ID starting with eef74f96a01b092b581681de9b8a508d009d627d8dffb9320229bccef67949d1 not found: ID does not exist" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.177180 4749 scope.go:117] "RemoveContainer" containerID="b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e" Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.177430 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e\": container with ID starting with b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e not found: ID does not exist" containerID="b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.177455 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e"} err="failed to get container status \"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e\": rpc error: code = NotFound desc = could not find container \"b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e\": container with ID starting with b785ba8e5642c4da00b2cd1fbe1c43db8e2b4914e9414aa1e94006783dfe688e not found: ID does not exist" Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.221821 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.221970 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:49.221938992 +0000 UTC m=+1552.394088909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.322984 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.323090 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:49.323065951 +0000 UTC m=+1552.495215828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.424548 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: E1129 01:36:45.424691 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:49.424666961 +0000 UTC m=+1552.596816818 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.769902 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830674 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvv5\" (UniqueName: \"kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830745 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830805 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.830896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs\") pod \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\" (UID: \"04501dca-4c62-4065-abb5-fbdfb9ce76fc\") " Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.836360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5" (OuterVolumeSpecName: "kube-api-access-gwvv5") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "kube-api-access-gwvv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.837501 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.875572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config" (OuterVolumeSpecName: "config") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.878175 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.887367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.895772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.910311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "04501dca-4c62-4065-abb5-fbdfb9ce76fc" (UID: "04501dca-4c62-4065-abb5-fbdfb9ce76fc"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932578 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvv5\" (UniqueName: \"kubernetes.io/projected/04501dca-4c62-4065-abb5-fbdfb9ce76fc-kube-api-access-gwvv5\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932628 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932644 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932662 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932678 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932692 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.932703 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/04501dca-4c62-4065-abb5-fbdfb9ce76fc-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 29 01:36:45 crc kubenswrapper[4749]: I1129 01:36:45.942180 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.160:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.000187 4749 generic.go:334] "Generic (PLEG): container finished" podID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerID="87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d" exitCode=0 Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.000251 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerDied","Data":"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d"} Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.000277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-75b4fcb8ff-s4n9j" event={"ID":"04501dca-4c62-4065-abb5-fbdfb9ce76fc","Type":"ContainerDied","Data":"ab7a0c1ee75630fab0b25790d247b8e662bf1512dec0c4315a9cb9eb4a24637d"} Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.000300 4749 scope.go:117] "RemoveContainer" containerID="12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.000413 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-75b4fcb8ff-s4n9j" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.029653 4749 scope.go:117] "RemoveContainer" containerID="87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.045604 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"] Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.054490 4749 scope.go:117] "RemoveContainer" containerID="12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10" Nov 29 01:36:46 crc kubenswrapper[4749]: E1129 01:36:46.054863 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10\": container with ID starting with 12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10 not found: ID does not exist" containerID="12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.054911 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10"} err="failed to get container status \"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10\": rpc error: code = NotFound desc = could not find container \"12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10\": container with ID starting with 12d22071dc420f131d5179182bf85f97e8cc5138dba95f471788dcc75fee0c10 not found: ID does not exist" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.054938 4749 scope.go:117] "RemoveContainer" containerID="87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d" Nov 29 01:36:46 crc kubenswrapper[4749]: E1129 01:36:46.055287 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d\": container with ID starting with 87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d not found: ID does not exist" containerID="87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.055339 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d"} err="failed to get container status \"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d\": rpc error: code = NotFound desc = could not find container \"87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d\": container with ID starting with 87111963349e754041224dec3ee96ec0b5915d9a24bb86f802d2a92b98cf9c1d not found: ID does not exist" Nov 29 01:36:46 crc kubenswrapper[4749]: I1129 01:36:46.057873 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-75b4fcb8ff-s4n9j"] Nov 29 01:36:47 crc kubenswrapper[4749]: I1129 01:36:47.091722 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" path="/var/lib/kubelet/pods/04501dca-4c62-4065-abb5-fbdfb9ce76fc/volumes" Nov 29 01:36:47 crc kubenswrapper[4749]: I1129 01:36:47.092421 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="132885d9-92b8-4cf2-b8e2-7f365fd0d020" path="/var/lib/kubelet/pods/132885d9-92b8-4cf2-b8e2-7f365fd0d020/volumes" Nov 29 01:36:47 
crc kubenswrapper[4749]: I1129 01:36:47.093014 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a17ee11-c21e-4edd-b1cf-8ba27bae166c" path="/var/lib/kubelet/pods/2a17ee11-c21e-4edd-b1cf-8ba27bae166c/volumes" Nov 29 01:36:47 crc kubenswrapper[4749]: I1129 01:36:47.094363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" path="/var/lib/kubelet/pods/9a9603fe-72d8-479a-86be-9b914455fba1/volumes" Nov 29 01:36:47 crc kubenswrapper[4749]: I1129 01:36:47.095020 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe7c0f3-6d3d-4117-af54-c7c79fd09d47" path="/var/lib/kubelet/pods/bfe7c0f3-6d3d-4117-af54-c7c79fd09d47/volumes" Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.289353 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.289749 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:57.289721123 +0000 UTC m=+1560.461871020 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.391377 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.391501 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:57.391472387 +0000 UTC m=+1560.563622284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.492792 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.492949 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:36:57.492919963 +0000 UTC m=+1560.665069860 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.878388 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.878843 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.879128 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.879185 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.879274 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.880856 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.884749 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:49 crc kubenswrapper[4749]: E1129 01:36:49.884790 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.878569 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.880585 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.880734 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.882071 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.882123 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.883685 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.892643 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:54 crc kubenswrapper[4749]: E1129 01:36:54.892713 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:36:55 crc kubenswrapper[4749]: I1129 01:36:55.076600 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:36:55 crc kubenswrapper[4749]: E1129 01:36:55.077057 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.361297 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.362598 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:37:13.362546649 +0000 UTC m=+1576.534696546 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.463477 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.463602 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:37:13.463576195 +0000 UTC m=+1576.635726082 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts") pod "neutron5123-account-delete-qmfnv" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6") : configmap "openstack-scripts" not found Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.567274 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:36:57 crc kubenswrapper[4749]: E1129 01:36:57.567390 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts podName:cd7300ea-27b0-438e-981c-7b862054e630 nodeName:}" failed. No retries permitted until 2025-11-29 01:37:13.567366071 +0000 UTC m=+1576.739515968 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts") pod "novacell08947-account-delete-x2bgc" (UID: "cd7300ea-27b0-438e-981c-7b862054e630") : configmap "openstack-scripts" not found Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.878429 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.879541 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.880023 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.880085 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.880309 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.881764 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.883850 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:36:59 crc kubenswrapper[4749]: E1129 01:36:59.883962 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.879019 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.881093 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.881261 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.881604 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.881673 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.885350 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.888663 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 29 01:37:04 crc kubenswrapper[4749]: E1129 01:37:04.888783 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-2k6f9" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.279367 4749 generic.go:334] "Generic (PLEG): container finished" podID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerID="894dac0e55c5b8303d062dc9a73ff359863207502ac50a567d5a516b5044255f" exitCode=137 Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.279475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"894dac0e55c5b8303d062dc9a73ff359863207502ac50a567d5a516b5044255f"} Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.787596 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.943568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zt7ns\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns\") pod \"9faf9cff-1e0a-4d87-b75e-8899450678a4\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.943804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") pod \"9faf9cff-1e0a-4d87-b75e-8899450678a4\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.943991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock\") pod \"9faf9cff-1e0a-4d87-b75e-8899450678a4\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.944112 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9faf9cff-1e0a-4d87-b75e-8899450678a4\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.944305 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache\") pod \"9faf9cff-1e0a-4d87-b75e-8899450678a4\" (UID: \"9faf9cff-1e0a-4d87-b75e-8899450678a4\") " Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.945681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock" (OuterVolumeSpecName: "lock") pod "9faf9cff-1e0a-4d87-b75e-8899450678a4" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.945915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache" (OuterVolumeSpecName: "cache") pod "9faf9cff-1e0a-4d87-b75e-8899450678a4" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.946224 4749 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-cache\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.946271 4749 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9faf9cff-1e0a-4d87-b75e-8899450678a4-lock\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.952492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9faf9cff-1e0a-4d87-b75e-8899450678a4" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.952911 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "9faf9cff-1e0a-4d87-b75e-8899450678a4" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 29 01:37:06 crc kubenswrapper[4749]: I1129 01:37:06.953282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns" (OuterVolumeSpecName: "kube-api-access-zt7ns") pod "9faf9cff-1e0a-4d87-b75e-8899450678a4" (UID: "9faf9cff-1e0a-4d87-b75e-8899450678a4"). InnerVolumeSpecName "kube-api-access-zt7ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.047883 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zt7ns\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-kube-api-access-zt7ns\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.047918 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9faf9cff-1e0a-4d87-b75e-8899450678a4-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.047947 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.061721 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.149781 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.304220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9faf9cff-1e0a-4d87-b75e-8899450678a4","Type":"ContainerDied","Data":"b4e612f85ce7e3f4849419fc2aed4f1ce08e272769ae1279cbbea6357590e912"} Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.304284 4749 scope.go:117] "RemoveContainer" containerID="894dac0e55c5b8303d062dc9a73ff359863207502ac50a567d5a516b5044255f" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.304577 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.311131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2k6f9_c7f827fe-55e8-4f5d-a074-bd79f9029382/ovs-vswitchd/0.log" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.312518 4749 generic.go:334] "Generic (PLEG): container finished" podID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" exitCode=137 Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.312567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerDied","Data":"cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952"} Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.370634 4749 scope.go:117] "RemoveContainer" containerID="efa94720f5ca79ab7d9121540b501d8e5d9310c95e8baf7e219e6f4cee72dadd" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.373490 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.380786 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.398700 4749 scope.go:117] "RemoveContainer" containerID="280811dcf44a523f325934a8c2570ea8ae0d344113018439d508fa6efc5324ec" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.423238 4749 scope.go:117] "RemoveContainer" containerID="a78d3a0a66909398264af9304c9055f1ddb5c500bf5846177971f56a8ebc2113" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.446639 4749 scope.go:117] "RemoveContainer" containerID="e9c55bc5cd269128e765337cd53c4eb2aa665fc0070c9dbb2cddb9feb42c9d56" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.470785 4749 scope.go:117] "RemoveContainer" containerID="2153b28d7b1ae703a650e64e126179253f7846999b9a6400cfd009a599bbb246" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.491529 4749 scope.go:117] "RemoveContainer" containerID="bfafc82b272a3145020f82bc16f80a2708db4958f657da2553e53da41914e8a6" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.514579 4749 scope.go:117] "RemoveContainer" containerID="9f220385e1bef364c02962b269f0e2db7e6230aaf50d37f52f736cd72a640af1" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.544706 4749 scope.go:117] "RemoveContainer" containerID="c135e668a9588d4d539c1f9ecce850efa84d0c9cf2ace29a444e0f2d6c4d4e1d" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.575937 4749 scope.go:117] "RemoveContainer" containerID="5cd4b93f19f43d8c22124c663685c2a5b7c3218c0e81bb3c102046e182dec0d8" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.608974 4749 scope.go:117] "RemoveContainer" containerID="e391c22d2e560a1daaa903407899dba8b65a96672ec1d83469d63ccacf47014a" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.635918 4749 scope.go:117] "RemoveContainer" containerID="c6544b18ba1c97f83dddee9f06c12698f0b180173fc59826441ce3ebe0d76ccb" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.660667 4749 scope.go:117] "RemoveContainer" containerID="f9bfeddd31df51ff31ba3def5c4f3f2e8ba2fc1efc9f023e76e32fba61e40263" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.687776 4749 scope.go:117] "RemoveContainer" containerID="e28b38f1e7521875f74819f009a7406efa1e468522d52a3aae45ded543cc2908" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.716431 4749 scope.go:117] 
"RemoveContainer" containerID="019a47cff718db2d0002a6650c9afb4a966ae40b1aef5de5b8fac16e7100d973" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.801094 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2k6f9_c7f827fe-55e8-4f5d-a074-bd79f9029382/ovs-vswitchd/0.log" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.802313 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.869973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870080 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run" (OuterVolumeSpecName: "var-run") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib" (OuterVolumeSpecName: "var-lib") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870966 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.870990 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-lib\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.871723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts" (OuterVolumeSpecName: "scripts") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.971706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc7dq\" (UniqueName: \"kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.971777 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.971886 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log\") pod \"c7f827fe-55e8-4f5d-a074-bd79f9029382\" (UID: \"c7f827fe-55e8-4f5d-a074-bd79f9029382\") " Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.972284 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.972412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log" (OuterVolumeSpecName: "var-log") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.972509 4749 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-etc-ovs\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.972533 4749 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c7f827fe-55e8-4f5d-a074-bd79f9029382-var-log\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.972551 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f827fe-55e8-4f5d-a074-bd79f9029382-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:07 crc kubenswrapper[4749]: I1129 01:37:07.977092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq" (OuterVolumeSpecName: "kube-api-access-pc7dq") pod "c7f827fe-55e8-4f5d-a074-bd79f9029382" (UID: "c7f827fe-55e8-4f5d-a074-bd79f9029382"). InnerVolumeSpecName "kube-api-access-pc7dq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.073576 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc7dq\" (UniqueName: \"kubernetes.io/projected/c7f827fe-55e8-4f5d-a074-bd79f9029382-kube-api-access-pc7dq\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.329976 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2k6f9_c7f827fe-55e8-4f5d-a074-bd79f9029382/ovs-vswitchd/0.log" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.331021 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2k6f9" event={"ID":"c7f827fe-55e8-4f5d-a074-bd79f9029382","Type":"ContainerDied","Data":"51f0a3d911779c645288199d5d3988fea18cd238d365c97137ef25a953555fbd"} Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.331096 4749 scope.go:117] "RemoveContainer" containerID="cae0e593c74d08a5be2b4518b0bd479ef36cb70ca9493dfe792d88c319b2e952" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.331109 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2k6f9" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.383041 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.393123 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-2k6f9"] Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.410211 4749 scope.go:117] "RemoveContainer" containerID="6f3a1d2c7d0ad0fa929c61e19de736b75f96d16f2132a4e19e56420ee60ce8c9" Nov 29 01:37:08 crc kubenswrapper[4749]: I1129 01:37:08.475700 4749 scope.go:117] "RemoveContainer" containerID="b40e601df6ea3022f7b2ce3ff2a8ce5c29d60d4fcad9dbdc9f6bcfff8e5e9b2c" Nov 29 01:37:09 crc kubenswrapper[4749]: I1129 01:37:09.094036 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" path="/var/lib/kubelet/pods/9faf9cff-1e0a-4d87-b75e-8899450678a4/volumes" Nov 29 01:37:09 crc kubenswrapper[4749]: I1129 01:37:09.097872 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" path="/var/lib/kubelet/pods/c7f827fe-55e8-4f5d-a074-bd79f9029382/volumes" Nov 29 01:37:10 crc kubenswrapper[4749]: I1129 01:37:10.075566 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:37:10 crc kubenswrapper[4749]: E1129 01:37:10.075806 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.080693 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement7db1-account-delete-x9f9k" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.133974 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod76910f08-d491-4b48-9439-78baad6ac3d3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod76910f08-d491-4b48-9439-78baad6ac3d3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod76910f08_d491_4b48_9439_78baad6ac3d3.slice" Nov 29 01:37:12 crc kubenswrapper[4749]: E1129 01:37:12.134048 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod76910f08-d491-4b48-9439-78baad6ac3d3] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod76910f08-d491-4b48-9439-78baad6ac3d3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod76910f08_d491_4b48_9439_78baad6ac3d3.slice" pod="openstack/swift-proxy-84b776bb8c-llx6x" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.159548 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5906d408-1c10-4c55-a07b-f94d302a08c6"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5906d408-1c10-4c55-a07b-f94d302a08c6] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5906d408_1c10_4c55_a07b_f94d302a08c6.slice" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.255794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts\") pod \"7a220429-0482-4578-9268-d4127f8da9af\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.255935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj8mf\" (UniqueName: \"kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf\") pod \"7a220429-0482-4578-9268-d4127f8da9af\" (UID: \"7a220429-0482-4578-9268-d4127f8da9af\") " Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.256873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a220429-0482-4578-9268-d4127f8da9af" (UID: "7a220429-0482-4578-9268-d4127f8da9af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.264162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf" (OuterVolumeSpecName: "kube-api-access-rj8mf") pod "7a220429-0482-4578-9268-d4127f8da9af" (UID: "7a220429-0482-4578-9268-d4127f8da9af"). InnerVolumeSpecName "kube-api-access-rj8mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.357836 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj8mf\" (UniqueName: \"kubernetes.io/projected/7a220429-0482-4578-9268-d4127f8da9af-kube-api-access-rj8mf\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.357902 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a220429-0482-4578-9268-d4127f8da9af-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.390888 4749 generic.go:334] "Generic (PLEG): container finished" podID="7a220429-0482-4578-9268-d4127f8da9af" containerID="5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47" exitCode=137 Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.390946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7db1-account-delete-x9f9k" event={"ID":"7a220429-0482-4578-9268-d4127f8da9af","Type":"ContainerDied","Data":"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47"} Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.390982 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement7db1-account-delete-x9f9k" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.391023 4749 scope.go:117] "RemoveContainer" containerID="5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.391007 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-84b776bb8c-llx6x" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.391004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement7db1-account-delete-x9f9k" event={"ID":"7a220429-0482-4578-9268-d4127f8da9af","Type":"ContainerDied","Data":"7f74bb0b5360f1feba86f4744283da2bcdd838d7e7ad8f935e865bede7d3fa8c"} Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.438812 4749 scope.go:117] "RemoveContainer" containerID="5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47" Nov 29 01:37:12 crc kubenswrapper[4749]: E1129 01:37:12.439353 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47\": container with ID starting with 5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47 not found: ID does not exist" containerID="5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.439606 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47"} err="failed to get container status \"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47\": rpc error: code = NotFound desc = could not find container \"5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47\": container with ID starting with 5739a4aea459d43972cca070a243a2402aa4df9d43cfe6a90241cd9e2b8b8e47 not found: ID does not exist" Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.467419 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"] Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.477789 4749 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-84b776bb8c-llx6x"] Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.486788 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"] Nov 29 01:37:12 crc kubenswrapper[4749]: I1129 01:37:12.493411 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement7db1-account-delete-x9f9k"] Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.102982 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" path="/var/lib/kubelet/pods/76910f08-d491-4b48-9439-78baad6ac3d3/volumes" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.104956 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a220429-0482-4578-9268-d4127f8da9af" path="/var/lib/kubelet/pods/7a220429-0482-4578-9268-d4127f8da9af/volumes" Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.154865 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5134b41_98a9_4555_9adb_c988862f59e6.slice/crio-conmon-27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f.scope\": RecentStats: unable to find data in memory cache]" Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.375428 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.375728 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts podName:8cb80122-cbde-418d-8f3f-367087068603 nodeName:}" failed. No retries permitted until 2025-11-29 01:37:45.375713956 +0000 UTC m=+1608.547863813 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts") pod "novaapi3ebb-account-delete-5jgft" (UID: "8cb80122-cbde-418d-8f3f-367087068603") : configmap "openstack-scripts" not found Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.396876 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell08947-account-delete-x2bgc" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.401963 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron5123-account-delete-qmfnv" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.402603 4749 generic.go:334] "Generic (PLEG): container finished" podID="d5134b41-98a9-4555-9adb-c988862f59e6" containerID="27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f" exitCode=137 Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.402677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5123-account-delete-qmfnv" event={"ID":"d5134b41-98a9-4555-9adb-c988862f59e6","Type":"ContainerDied","Data":"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f"} Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.402722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron5123-account-delete-qmfnv" event={"ID":"d5134b41-98a9-4555-9adb-c988862f59e6","Type":"ContainerDied","Data":"6e56cdb4e94033244a20a4367d016650c4668ba9f08cec1c4f19b24a67ba9cc9"} Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.402743 4749 scope.go:117] "RemoveContainer" containerID="27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.405359 4749 generic.go:334] "Generic (PLEG): container finished" podID="cd7300ea-27b0-438e-981c-7b862054e630" containerID="11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699" exitCode=137 Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.405412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell08947-account-delete-x2bgc" event={"ID":"cd7300ea-27b0-438e-981c-7b862054e630","Type":"ContainerDied","Data":"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699"} Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.405432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell08947-account-delete-x2bgc" event={"ID":"cd7300ea-27b0-438e-981c-7b862054e630","Type":"ContainerDied","Data":"ef8f8db098b5bc8512d86217bb4e3b30090a13e3fae92a174b5f6fba0e25561b"} Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.405479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell08947-account-delete-x2bgc" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.442762 4749 scope.go:117] "RemoveContainer" containerID="27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f" Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.443253 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f\": container with ID starting with 27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f not found: ID does not exist" containerID="27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.443288 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f"} err="failed to get container status \"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f\": rpc error: code = NotFound desc = could not find container \"27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f\": container with ID starting with 27d9df9148fe7d02ca2828b0c9b00e6402e921898f20d7336ec731e4b72c7f3f not found: ID does not exist" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.443314 4749 scope.go:117] "RemoveContainer" containerID="11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.464904 4749 scope.go:117] "RemoveContainer" containerID="11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699" Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.465391 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699\": container with ID starting with 11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699 not found: ID does not exist" containerID="11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699" Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.465422 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699"} err="failed to get container status \"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699\": rpc error: code = NotFound desc = could not find container \"11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699\": container with ID starting with 11cbdde252b5a26a0a0876279ae5caa8ddf201a33fc143ce62fa092a6272f699 not found: ID does not exist" Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.476606 4749 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 29 01:37:13 crc kubenswrapper[4749]: E1129 01:37:13.476667 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts podName:d5134b41-98a9-4555-9adb-c988862f59e6 nodeName:}" failed. No retries permitted until 2025-11-29 01:37:45.47665398 +0000 UTC m=+1608.648803837 (durationBeforeRetry 32s). 
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.577044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") pod \"d5134b41-98a9-4555-9adb-c988862f59e6\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") "
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.577193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts\") pod \"cd7300ea-27b0-438e-981c-7b862054e630\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") "
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.577358 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbkkp\" (UniqueName: \"kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp\") pod \"cd7300ea-27b0-438e-981c-7b862054e630\" (UID: \"cd7300ea-27b0-438e-981c-7b862054e630\") "
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.577434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q26h\" (UniqueName: \"kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h\") pod \"d5134b41-98a9-4555-9adb-c988862f59e6\" (UID: \"d5134b41-98a9-4555-9adb-c988862f59e6\") "
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.577991 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5134b41-98a9-4555-9adb-c988862f59e6" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.578435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd7300ea-27b0-438e-981c-7b862054e630" (UID: "cd7300ea-27b0-438e-981c-7b862054e630"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.582555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp" (OuterVolumeSpecName: "kube-api-access-sbkkp") pod "cd7300ea-27b0-438e-981c-7b862054e630" (UID: "cd7300ea-27b0-438e-981c-7b862054e630"). InnerVolumeSpecName "kube-api-access-sbkkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.584242 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h" (OuterVolumeSpecName: "kube-api-access-7q26h") pod "d5134b41-98a9-4555-9adb-c988862f59e6" (UID: "d5134b41-98a9-4555-9adb-c988862f59e6"). InnerVolumeSpecName "kube-api-access-7q26h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.679877 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd7300ea-27b0-438e-981c-7b862054e630-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.679927 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbkkp\" (UniqueName: \"kubernetes.io/projected/cd7300ea-27b0-438e-981c-7b862054e630-kube-api-access-sbkkp\") on node \"crc\" DevicePath \"\""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.679951 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q26h\" (UniqueName: \"kubernetes.io/projected/d5134b41-98a9-4555-9adb-c988862f59e6-kube-api-access-7q26h\") on node \"crc\" DevicePath \"\""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.679969 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5134b41-98a9-4555-9adb-c988862f59e6-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.747762 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"]
Nov 29 01:37:13 crc kubenswrapper[4749]: I1129 01:37:13.753983 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell08947-account-delete-x2bgc"]
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.421616 4749 generic.go:334] "Generic (PLEG): container finished" podID="8cb80122-cbde-418d-8f3f-367087068603" containerID="6ab8ab3317b880f167426788d0636c2f6f74ab5861bf7c160fed8a33923e7357" exitCode=137
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.422032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi3ebb-account-delete-5jgft" event={"ID":"8cb80122-cbde-418d-8f3f-367087068603","Type":"ContainerDied","Data":"6ab8ab3317b880f167426788d0636c2f6f74ab5861bf7c160fed8a33923e7357"}
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.422074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi3ebb-account-delete-5jgft" event={"ID":"8cb80122-cbde-418d-8f3f-367087068603","Type":"ContainerDied","Data":"5f1ec69a9ae451e2eb713f1c052ad20d15584974edf36c0cf96f0a6f18f30ace"}
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.422097 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1ec69a9ae451e2eb713f1c052ad20d15584974edf36c0cf96f0a6f18f30ace"
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.424593 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron5123-account-delete-qmfnv"
Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.488238 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi3ebb-account-delete-5jgft"
Need to start a new one" pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.519918 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"] Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.536297 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron5123-account-delete-qmfnv"] Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.604359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zh56\" (UniqueName: \"kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56\") pod \"8cb80122-cbde-418d-8f3f-367087068603\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.604486 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts\") pod \"8cb80122-cbde-418d-8f3f-367087068603\" (UID: \"8cb80122-cbde-418d-8f3f-367087068603\") " Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.605542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cb80122-cbde-418d-8f3f-367087068603" (UID: "8cb80122-cbde-418d-8f3f-367087068603"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.609209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56" (OuterVolumeSpecName: "kube-api-access-8zh56") pod "8cb80122-cbde-418d-8f3f-367087068603" (UID: "8cb80122-cbde-418d-8f3f-367087068603"). InnerVolumeSpecName "kube-api-access-8zh56". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.705927 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zh56\" (UniqueName: \"kubernetes.io/projected/8cb80122-cbde-418d-8f3f-367087068603-kube-api-access-8zh56\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:14 crc kubenswrapper[4749]: I1129 01:37:14.705968 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cb80122-cbde-418d-8f3f-367087068603-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 01:37:15 crc kubenswrapper[4749]: I1129 01:37:15.090027 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7300ea-27b0-438e-981c-7b862054e630" path="/var/lib/kubelet/pods/cd7300ea-27b0-438e-981c-7b862054e630/volumes" Nov 29 01:37:15 crc kubenswrapper[4749]: I1129 01:37:15.091019 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5134b41-98a9-4555-9adb-c988862f59e6" path="/var/lib/kubelet/pods/d5134b41-98a9-4555-9adb-c988862f59e6/volumes" Nov 29 01:37:15 crc kubenswrapper[4749]: I1129 01:37:15.441991 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi3ebb-account-delete-5jgft" Nov 29 01:37:15 crc kubenswrapper[4749]: I1129 01:37:15.475772 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:37:15 crc kubenswrapper[4749]: I1129 01:37:15.487901 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi3ebb-account-delete-5jgft"] Nov 29 01:37:17 crc kubenswrapper[4749]: I1129 01:37:17.093763 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb80122-cbde-418d-8f3f-367087068603" path="/var/lib/kubelet/pods/8cb80122-cbde-418d-8f3f-367087068603/volumes" Nov 29 01:37:25 crc kubenswrapper[4749]: I1129 01:37:25.075102 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:37:25 crc kubenswrapper[4749]: E1129 01:37:25.076235 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:37:37 crc kubenswrapper[4749]: I1129 01:37:37.085264 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:37:37 crc kubenswrapper[4749]: E1129 01:37:37.086598 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:37:51 crc kubenswrapper[4749]: I1129 01:37:51.074894 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:37:51 crc kubenswrapper[4749]: E1129 01:37:51.075789 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:38:03 crc kubenswrapper[4749]: I1129 01:38:03.074934 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:38:03 crc kubenswrapper[4749]: E1129 01:38:03.075943 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.574559 4749 scope.go:117] "RemoveContainer" containerID="4443b1791cbae8653c74d15ac09fef4fc47d04d4da5dd53b897e42b9925d9a8e" Nov 29 01:38:08 crc 
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.667965 4749 scope.go:117] "RemoveContainer" containerID="d387c2987e2dc3792bc71d9ecfbf2e6a56b4c44b79c8cc37de7229402aa58215"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.710402 4749 scope.go:117] "RemoveContainer" containerID="3dc57790e819b5c2fd7d24841a617fcfe2253270800c29923a0f8a3796f1a9a5"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.742958 4749 scope.go:117] "RemoveContainer" containerID="2aef9b804a300d8b097b904359b47ca8d68cf5bd5cb50f9e4a509ec03501f312"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.772716 4749 scope.go:117] "RemoveContainer" containerID="d8dc0505ade6c3a7b001fd49ab3118193dd6af3d47615b2a7ee7ceba6fef3ffc"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.801728 4749 scope.go:117] "RemoveContainer" containerID="264f3785ab82460c75c592ee847ad04af28a33b846be9d53ed84791d7f08ac24"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.831842 4749 scope.go:117] "RemoveContainer" containerID="f7ced13b7586036aaf8e13a7f09182c81b2e554a8054b19fe03c90cd5ca2e42b"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.852260 4749 scope.go:117] "RemoveContainer" containerID="1e24c52665f61a339ea805c9382b78f4544ad14785f2be7457cd2e7fb68f1536"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.889394 4749 scope.go:117] "RemoveContainer" containerID="11f7ead3ae7a578e0a2ea81d3114b0e794f04b33101b661640147fb015d3433e"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.911450 4749 scope.go:117] "RemoveContainer" containerID="99755c868763f972128f21ca651af421ac1e461e7cfbe7a329adf4cc6dd1628d"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.949404 4749 scope.go:117] "RemoveContainer" containerID="877b8b5eb182d8827c6031d04c1c5302137ddc4813ede6360099d452ed4c9288"
Nov 29 01:38:08 crc kubenswrapper[4749]: I1129 01:38:08.976351 4749 scope.go:117] "RemoveContainer" containerID="35029c3554395ea9426e60b18014966674f056157d89d9bbcc57908c5a3991cc"
Nov 29 01:38:09 crc kubenswrapper[4749]: I1129 01:38:09.007869 4749 scope.go:117] "RemoveContainer" containerID="d0569384a350f4295e192fcee3f98b3e3e7b0e1e279219472a2ca0815c60cdf2"
Nov 29 01:38:15 crc kubenswrapper[4749]: I1129 01:38:15.075019 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"
Nov 29 01:38:15 crc kubenswrapper[4749]: E1129 01:38:15.075658 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:38:28 crc kubenswrapper[4749]: I1129 01:38:28.076634 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"
Nov 29 01:38:28 crc kubenswrapper[4749]: E1129 01:38:28.077830 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:38:40 crc kubenswrapper[4749]: I1129 01:38:40.075279 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"
Nov 29 01:38:40 crc kubenswrapper[4749]: E1129 01:38:40.077122 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:38:53 crc kubenswrapper[4749]: I1129 01:38:53.075447 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"
Nov 29 01:38:53 crc kubenswrapper[4749]: E1129 01:38:53.076429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:39:04 crc kubenswrapper[4749]: I1129 01:39:04.075662 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0"
Nov 29 01:39:04 crc kubenswrapper[4749]: E1129 01:39:04.076627 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.321330 4749 scope.go:117] "RemoveContainer" containerID="1c3f1dfce598ef9b86d1af6c3f250556b2470fb0d92e58c0d661d554b6eca798"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.350666 4749 scope.go:117] "RemoveContainer" containerID="8bde49f653c7a890a3f11428b5610e77b4cf7e9835e8547b818296896403a5aa"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.378993 4749 scope.go:117] "RemoveContainer" containerID="79eea270e39e7f4cc2a391113d6a21f6ec7ca132b30b00de81c6ba37e4fee5f5"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.423794 4749 scope.go:117] "RemoveContainer" containerID="e5f55d830c3b9beb232739425c6d8c977604c4ed1e22fec74f43465868d1c470"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.478760 4749 scope.go:117] "RemoveContainer" containerID="63710a4616bf5617142071918ea3fd7b7d08dccb9a14eb02d7c475c5731d8bc3"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.504911 4749 scope.go:117] "RemoveContainer" containerID="0f0213b5ccee6839849d82e20a5d7ee9aa22319de6297d00f42c6ec878d4a759"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.550461 4749 scope.go:117] "RemoveContainer" containerID="aafe1964fd0d38ca73b43970e806be9da911509377abf07f7b6364eee4b52252"
Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.593334 4749 scope.go:117] "RemoveContainer" containerID="f861eef455efe7c4d77d157775bb326fa3614266dfd5ecbfd8147c59729d017f"
containerID="f861eef455efe7c4d77d157775bb326fa3614266dfd5ecbfd8147c59729d017f" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.612920 4749 scope.go:117] "RemoveContainer" containerID="3e486a43c1b894bc08428cc6a8f5fcc03bc72bb1a25379d9061ae2fe460693e9" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.636665 4749 scope.go:117] "RemoveContainer" containerID="a8c8a713e75709f52b35c805603adbd78c58b5ef7e39c57d1ccf8bbf1b1524a2" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.663338 4749 scope.go:117] "RemoveContainer" containerID="99d208eb4c50e726cc807bb0e460337bfaea3b2f788b14b09f5f549cd368568d" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.696536 4749 scope.go:117] "RemoveContainer" containerID="9f887fa6c5c95bfb17f5897d64af04248c64f3261379834f006b8fcb67264158" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.730433 4749 scope.go:117] "RemoveContainer" containerID="fb2620ea85039940ac22ecc63eb62adbadd8cd8f12ac051b31c46aa498066f54" Nov 29 01:39:09 crc kubenswrapper[4749]: I1129 01:39:09.760070 4749 scope.go:117] "RemoveContainer" containerID="f7d18dd171d151bb5f7f072cb26eda7543c6d923103ab1c8ac9ee90af97d0e8b" Nov 29 01:39:16 crc kubenswrapper[4749]: I1129 01:39:16.074919 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:39:16 crc kubenswrapper[4749]: E1129 01:39:16.075873 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:39:31 crc kubenswrapper[4749]: I1129 01:39:31.078328 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:39:31 crc kubenswrapper[4749]: E1129 01:39:31.079611 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:39:42 crc kubenswrapper[4749]: I1129 01:39:42.074843 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:39:42 crc kubenswrapper[4749]: E1129 01:39:42.075872 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:39:54 crc kubenswrapper[4749]: I1129 01:39:54.075456 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:39:54 crc kubenswrapper[4749]: E1129 01:39:54.076447 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:40:07 crc kubenswrapper[4749]: I1129 01:40:07.083548 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:40:07 crc kubenswrapper[4749]: E1129 01:40:07.084786 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.038614 4749 scope.go:117] "RemoveContainer" containerID="23c1f9a9695d62882d05b68f2a362774fd211e09dcdb543ab2bb762d27b58e29" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.070529 4749 scope.go:117] "RemoveContainer" containerID="cd4b2d7fe24bdd93d10fb596919ae826b0504292c7e202f61e2a5d2d20edfc19" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.101142 4749 scope.go:117] "RemoveContainer" containerID="ee0799db0869202f475d230e6942e96a55275b4ddc42e2b4b308c9e58f27ff1f" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.136476 4749 scope.go:117] "RemoveContainer" containerID="eced19490e815516b429f396da09f8236d99a0db8fbad5ff60fe95e770b29786" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.187461 4749 scope.go:117] "RemoveContainer" containerID="02683ddd55e0d8639114410723c72fa11f946bafbeb6b895d2b31282c1143425" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.216271 4749 scope.go:117] "RemoveContainer" containerID="89b878f4648a1b7a255f4ce926f9a55d53da0950d5f663bde6352f702c375d11" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.273835 4749 scope.go:117] "RemoveContainer" containerID="0957fdee810eb3eb8c31f26782750e24fed0ee07cdb3eb0ee23af55d2e009a5c" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.304361 4749 scope.go:117] "RemoveContainer" containerID="8d561778eb4b907693839e5fa61e32e52ffe73716ac1b9c6c7bd406db4403383" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.325441 4749 scope.go:117] "RemoveContainer" containerID="5ed421f49bc336804d73a5ee8923bd091bdb89ba7051efb3930f5c62c2dfef03" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.363379 4749 scope.go:117] "RemoveContainer" containerID="1b79b846ba48ed32c135a2f846a3f7529ae64e566dfcbb87a5a76c965c461763" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.385903 4749 scope.go:117] "RemoveContainer" containerID="1d60feb658afa56447686bacf5087fdb32435f3e030225e481d08f3b86fab8ec" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.414632 4749 scope.go:117] "RemoveContainer" containerID="a7824ea4a1031c1f65c49977bfaf87c386308a4ee677e25e7488e91b8b1a8925" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.435917 4749 scope.go:117] "RemoveContainer" containerID="e944097794538d3c014641110ebd81f38c1dcad22ea2f4aa4c69b522c4b836ee" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.463905 4749 scope.go:117] "RemoveContainer" containerID="ac6b10cbf89933157f87ce0832e498f620cc616f9e2d34fbbac8faa2f0a1cdbd" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.483912 4749 scope.go:117] 
"RemoveContainer" containerID="e2aab48732b202c3e72e85886215cd2e441185bbc74a2b999dc3bff8161959f4" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.519553 4749 scope.go:117] "RemoveContainer" containerID="ff6a195164c1a51091d5571bc5e7947391e0cf746aff0edfbb3435b23fa14721" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.543173 4749 scope.go:117] "RemoveContainer" containerID="f743ba6ff8b1da63b560049f4030c6a63de306ec90e8671821fcc72e7725c52a" Nov 29 01:40:10 crc kubenswrapper[4749]: I1129 01:40:10.571152 4749 scope.go:117] "RemoveContainer" containerID="a5eb98b5651d00f847dc7baf4de09dc7755d80312cb6155f7d2d577d49c2b8e1" Nov 29 01:40:21 crc kubenswrapper[4749]: I1129 01:40:21.076175 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:40:21 crc kubenswrapper[4749]: E1129 01:40:21.077487 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:40:36 crc kubenswrapper[4749]: I1129 01:40:36.077141 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:40:36 crc kubenswrapper[4749]: E1129 01:40:36.080497 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:40:48 crc kubenswrapper[4749]: I1129 01:40:48.075901 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:40:48 crc kubenswrapper[4749]: E1129 01:40:48.077210 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:41:00 crc kubenswrapper[4749]: I1129 01:41:00.074999 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:41:00 crc kubenswrapper[4749]: E1129 01:41:00.076460 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:41:10 crc kubenswrapper[4749]: I1129 01:41:10.855730 4749 scope.go:117] "RemoveContainer" containerID="60d404c089766276e2b776b298e5f3050db4dd04153458e59685a427fe248694" Nov 29 01:41:10 crc kubenswrapper[4749]: I1129 
01:41:10.889990 4749 scope.go:117] "RemoveContainer" containerID="fe391866b222bdc27b02c092a42376ca48bf580b3de4ed28527aad1c47cfa12b" Nov 29 01:41:10 crc kubenswrapper[4749]: I1129 01:41:10.913895 4749 scope.go:117] "RemoveContainer" containerID="5144de4befe6c5a404315ac3261b56ae92db97d58334720fc31752c8b0925912" Nov 29 01:41:10 crc kubenswrapper[4749]: I1129 01:41:10.949145 4749 scope.go:117] "RemoveContainer" containerID="057b70f2300d8a9d4f6146699c4726942e6bf05efd91b5b8e7130fb0cdcd5547" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.012851 4749 scope.go:117] "RemoveContainer" containerID="b451af15ca02cf0e5e8a766761169627777887698c277c719a9b70488450c040" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.045514 4749 scope.go:117] "RemoveContainer" containerID="81a360b2af0a80524dd327e6ff413357d3ae93289f88811a965e85a9ada36073" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.073043 4749 scope.go:117] "RemoveContainer" containerID="7334ced92ad7e7487e07d11477e12388455a9967e089ebd9586c29cd48fd9002" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.075272 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:41:11 crc kubenswrapper[4749]: E1129 01:41:11.075752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.097861 4749 scope.go:117] "RemoveContainer" containerID="2b52f160e991611b4fa569384437e823fb44de2a911736414952d94b4cc7eca3" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.130037 4749 scope.go:117] "RemoveContainer" containerID="82990d2dcf0c8a627fe2e51a0ec691a173345be20d660bbf2193506a1de812d5" Nov 29 01:41:11 crc kubenswrapper[4749]: I1129 01:41:11.195575 4749 scope.go:117] "RemoveContainer" containerID="3921629adb03e7df47ffbd7236135a367c35c210410b3590dffa731dd7125961" Nov 29 01:41:25 crc kubenswrapper[4749]: I1129 01:41:25.076666 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:41:25 crc kubenswrapper[4749]: E1129 01:41:25.077609 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:41:37 crc kubenswrapper[4749]: I1129 01:41:37.083371 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:41:37 crc kubenswrapper[4749]: I1129 01:41:37.456275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5"} Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.331906 4749 scope.go:117] "RemoveContainer" 
containerID="5fea5dc22f78f4b44307aafccca2e0586ca8c1965943165df879437b824209c1" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.366106 4749 scope.go:117] "RemoveContainer" containerID="71b2e244af901dca544d387980cde284750449022d8bbea0c7e5b1dfa19f0e59" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.413986 4749 scope.go:117] "RemoveContainer" containerID="9eed71d2ea4b0dda3706b093b511a6005492203beb6c1a0d7d09b12729581237" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.448374 4749 scope.go:117] "RemoveContainer" containerID="7ed486eaa05c6c64b3403cd46e5913c80fe7c998f3c94c72b271939918850910" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.466833 4749 scope.go:117] "RemoveContainer" containerID="a92e13de7e8f2651d46dd244243eee9c0810da21b48c9f85f4c7ffacbccd381e" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.490815 4749 scope.go:117] "RemoveContainer" containerID="07a474eb6ac2724e1b0cdecbb5da046a4117a4bce874254d62a90db087962686" Nov 29 01:42:11 crc kubenswrapper[4749]: I1129 01:42:11.515271 4749 scope.go:117] "RemoveContainer" containerID="b6161f4f801200f63a69f8699a09c9b84f3a1ded5a4c508dc41de08731035fec" Nov 29 01:43:11 crc kubenswrapper[4749]: I1129 01:43:11.627007 4749 scope.go:117] "RemoveContainer" containerID="3baf52548ae23fe4dec7847cae0b2d3819b57cfb6e4936dd7cf1ef54cfe2c8b3" Nov 29 01:43:11 crc kubenswrapper[4749]: I1129 01:43:11.657264 4749 scope.go:117] "RemoveContainer" containerID="286081c662919600ce871444f21fff6de8afe86aaa6b772098323d22bdea23b5" Nov 29 01:43:11 crc kubenswrapper[4749]: I1129 01:43:11.685825 4749 scope.go:117] "RemoveContainer" containerID="6ab8ab3317b880f167426788d0636c2f6f74ab5861bf7c160fed8a33923e7357" Nov 29 01:43:11 crc kubenswrapper[4749]: I1129 01:43:11.718638 4749 scope.go:117] "RemoveContainer" containerID="66d79493b0a5b5a1671ff38a71caa57e6969e56432f8df85f19ce7f9774d8fbe" Nov 29 01:43:55 crc kubenswrapper[4749]: I1129 01:43:55.373905 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:43:55 crc kubenswrapper[4749]: I1129 01:43:55.374515 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:44:25 crc kubenswrapper[4749]: I1129 01:44:25.374192 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:44:25 crc kubenswrapper[4749]: I1129 01:44:25.374837 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:44:55 crc kubenswrapper[4749]: I1129 01:44:55.375080 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:44:55 crc kubenswrapper[4749]: I1129 01:44:55.375763 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:44:55 crc kubenswrapper[4749]: I1129 01:44:55.375834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:44:55 crc kubenswrapper[4749]: I1129 01:44:55.376906 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:44:55 crc kubenswrapper[4749]: I1129 01:44:55.377010 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5" gracePeriod=600 Nov 29 01:44:56 crc kubenswrapper[4749]: I1129 01:44:56.388345 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5" exitCode=0 Nov 29 01:44:56 crc kubenswrapper[4749]: I1129 01:44:56.388471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5"} Nov 29 01:44:56 crc kubenswrapper[4749]: I1129 01:44:56.388796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89"} Nov 29 01:44:56 crc kubenswrapper[4749]: I1129 01:44:56.388839 4749 scope.go:117] "RemoveContainer" containerID="6ebe8a023d9a41ca086975e4761d0435804a6bd2e7872d7a4a32b605cce7aac0" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.163819 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9"] Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164726 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="probe" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164744 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="probe" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164763 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener" Nov 29 01:45:00 crc kubenswrapper[4749]: 
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164780 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164788 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164801 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="openstack-network-exporter"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164810 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="openstack-network-exporter"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164823 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49a0e83-3d6d-47e1-8ca4-4cae46d23168" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164831 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49a0e83-3d6d-47e1-8ca4-4cae46d23168" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164844 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164852 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164867 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-reaper"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164875 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-reaper"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164887 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164894 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164902 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164918 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="proxy-httpd"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164925 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="proxy-httpd"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5134b41-98a9-4555-9adb-c988862f59e6" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164943 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5134b41-98a9-4555-9adb-c988862f59e6" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164958 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-central-agent"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164966 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-central-agent"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164974 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-httpd"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.164981 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-httpd"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.164994 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb80122-cbde-418d-8f3f-367087068603" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165001 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb80122-cbde-418d-8f3f-367087068603" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165017 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165024 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165038 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="mysql-bootstrap"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165045 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="mysql-bootstrap"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165058 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165065 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165077 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165084 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" containerName="mariadb-account-delete"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165098 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-server"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165105 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-server"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165115 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="rabbitmq"
"RemoveStaleState: removing container" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165122 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165131 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="galera" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="galera" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165152 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165159 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165174 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server-init" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165182 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server-init" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165190 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a220429-0482-4578-9268-d4127f8da9af" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165217 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a220429-0482-4578-9268-d4127f8da9af" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165228 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="dnsmasq-dns" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165235 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="dnsmasq-dns" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165246 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165252 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165263 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165281 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165288 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165296 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerName="kube-state-metrics" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165303 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerName="kube-state-metrics" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165313 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165320 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165330 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-auditor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165336 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-auditor" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165349 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerName="nova-cell0-conductor-conductor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165356 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerName="nova-cell0-conductor-conductor" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165366 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165374 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api-log" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165384 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165391 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165400 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823cce34-3656-4edf-9197-5586262263ec" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165408 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="823cce34-3656-4edf-9197-5586262263ec" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165420 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="init" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165426 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="init" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165435 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="sg-core" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="sg-core" Nov 
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165452 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-auditor"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165460 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-auditor"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165470 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165476 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165487 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165493 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165505 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerName="nova-cell1-conductor-conductor"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165513 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerName="nova-cell1-conductor-conductor"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165521 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165528 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165538 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28df216a-4f1e-449f-aaf6-45fd12929ad8" containerName="keystone-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165546 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="28df216a-4f1e-449f-aaf6-45fd12929ad8" containerName="keystone-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165557 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165563 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-log"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165574 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="galera"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165605 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="galera"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165616 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="rsync"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="rsync"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165631 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-expirer"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165638 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-expirer"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165647 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="ovsdbserver-sb"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165654 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="ovsdbserver-sb"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165662 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="cinder-scheduler"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165670 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="cinder-scheduler"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165679 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165696 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165705 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-replicator"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-auditor"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165724 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-auditor"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165736 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165743 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165756 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165762 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api"
Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165774 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-httpd"
Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165782 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-httpd"
Nov 29
01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165793 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7300ea-27b0-438e-981c-7b862054e630" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165802 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7300ea-27b0-438e-981c-7b862054e630" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165815 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165824 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-api" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165835 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="132885d9-92b8-4cf2-b8e2-7f365fd0d020" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="132885d9-92b8-4cf2-b8e2-7f365fd0d020" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165857 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165867 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165883 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165892 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-server" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165902 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165911 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-api" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165937 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="setup-container" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165946 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="setup-container" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165958 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165967 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165977 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="mysql-bootstrap" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.165985 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" 
containerName="mysql-bootstrap" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.165997 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3962a4be-25eb-45f6-8b1a-f84341319df3" containerName="memcached" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3962a4be-25eb-45f6-8b1a-f84341319df3" containerName="memcached" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166016 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="setup-container" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166024 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="setup-container" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166037 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166058 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166067 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166078 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166085 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-api" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166096 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166104 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener-log" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166113 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166121 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-server" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166130 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166149 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-notification-agent" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166156 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-notification-agent" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166168 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166566 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-server" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166583 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="ovsdbserver-nb" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166592 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="ovsdbserver-nb" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166603 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="swift-recon-cron" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166641 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="swift-recon-cron" Nov 29 01:45:00 crc kubenswrapper[4749]: E1129 01:45:00.166654 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerName="nova-scheduler-scheduler" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166661 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerName="nova-scheduler-scheduler" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166820 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc6d8b8-4291-4f54-8bb2-508933b39c5a" containerName="nova-cell1-conductor-conductor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166839 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166855 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49a0e83-3d6d-47e1-8ca4-4cae46d23168" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166866 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-auditor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166873 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166882 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166893 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" 
containerName="container-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166905 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166912 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a342ef2-ab29-4277-a7b4-0f83e5c3ca21" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166922 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6565857b-6329-46a9-b8c0-4cad1019c4b9" containerName="dnsmasq-dns" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.166933 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="823cce34-3656-4edf-9197-5586262263ec" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167569 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167584 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="probe" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167592 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="ovsdbserver-nb" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167602 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovs-vswitchd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167614 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="rsync" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167625 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7300ea-27b0-438e-981c-7b862054e630" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167637 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-central-agent" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167649 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-auditor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167659 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9603fe-72d8-479a-86be-9b914455fba1" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167671 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-expirer" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167682 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167691 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="ovn-northd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167698 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-reaper" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167710 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167721 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c639d859-841e-4f38-a2b3-09fc3201e616" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167732 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e422f911-d2a1-48ac-9ad7-9394647ad23c" containerName="barbican-api-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167741 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f827fe-55e8-4f5d-a074-bd79f9029382" containerName="ovsdb-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167751 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05059ec-0cc5-4873-8041-bb14c2fa4c53" containerName="galera" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167762 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb80122-cbde-418d-8f3f-367087068603" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167774 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167782 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167792 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04501dca-4c62-4065-abb5-fbdfb9ce76fc" containerName="neutron-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167804 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167814 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76910f08-d491-4b48-9439-78baad6ac3d3" containerName="proxy-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167826 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="ovsdbserver-sb" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167837 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b929be5-bc3e-47c6-8ac4-0ada6d740a7f" containerName="cinder-scheduler" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5906d408-1c10-4c55-a07b-f94d302a08c6" containerName="galera" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167859 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a087f05-8b7d-4207-88e8-1c622d57c653" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167871 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-auditor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167880 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d17ec44-0d41-4ec3-bb2d-e89b0f6f14b8" containerName="glance-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167889 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" 
containerName="account-replicator" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167896 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3ce27f-fb0b-45a9-99e8-e6e98c5a17ee" containerName="placement-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167906 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="132885d9-92b8-4cf2-b8e2-7f365fd0d020" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167916 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8715ecba-f08b-4fcd-b129-d9e9c568e087" containerName="cinder-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167923 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5134b41-98a9-4555-9adb-c988862f59e6" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167934 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-metadata" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167945 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e2ea248-43ad-4851-96ea-3e6adaba3ef0" containerName="openstack-network-exporter" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167952 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f9e6e2-7ed0-4977-a558-27fb6a9d4001" containerName="nova-metadata-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167964 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="swift-recon-cron" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167970 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="26a4b5e6-f82a-4316-a7e8-d596136086c2" containerName="nova-cell0-conductor-conductor" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167983 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="65fd8520-689b-4f93-850e-bac0cec97025" containerName="ovn-controller" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.167995 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91885d5-497a-41e4-9796-ca25f184b178" containerName="nova-api-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168005 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9cf96c-6bdd-425d-8983-4bfa2250edda" containerName="kube-state-metrics" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d65a6e-107e-4ffe-8561-49b1dd2e9aba" containerName="barbican-keystone-listener-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168024 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168034 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28df216a-4f1e-449f-aaf6-45fd12929ad8" containerName="keystone-api" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168043 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a44203-fd94-4eb4-952f-d54a5c577095" containerName="rabbitmq" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168054 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d619b935-7717-4e88-af76-97e946d3cef5" containerName="openstack-network-exporter" Nov 29 
01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168062 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-replicator" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168075 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff73098b-8f03-43ac-9e1d-3ac7edd2589d" containerName="glance-log" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168085 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3962a4be-25eb-45f6-8b1a-f84341319df3" containerName="memcached" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168095 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-replicator" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168106 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="proxy-httpd" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168116 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="sg-core" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168123 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="container-updater" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168133 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db8c8b1-f222-46a2-9a4a-e9e48f27802c" containerName="ceilometer-notification-agent" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168142 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5792ec0-0d00-47e7-8d9d-d3133cd1e695" containerName="nova-scheduler-scheduler" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168151 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="account-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168164 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74c0b83-feac-4f92-b5cf-1ec7c9e298a4" containerName="barbican-worker" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168175 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faf9cff-1e0a-4d87-b75e-8899450678a4" containerName="object-server" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168182 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a220429-0482-4578-9268-d4127f8da9af" containerName="mariadb-account-delete" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.168767 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.172036 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.173936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.177613 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9"] Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.322037 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.322077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.322116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxmtr\" (UniqueName: \"kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.423761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.423818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.423873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxmtr\" (UniqueName: \"kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.428538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume\") pod 
\"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.433750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.454492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxmtr\" (UniqueName: \"kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr\") pod \"collect-profiles-29406345-kf6j9\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.491504 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:00 crc kubenswrapper[4749]: I1129 01:45:00.764463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9"] Nov 29 01:45:01 crc kubenswrapper[4749]: I1129 01:45:01.441898 4749 generic.go:334] "Generic (PLEG): container finished" podID="4960404c-fe87-49b2-8574-c6f268509899" containerID="77a978578c0c8b13e73bbf481b17d6950413a06b2fa1b645983728425088d52d" exitCode=0 Nov 29 01:45:01 crc kubenswrapper[4749]: I1129 01:45:01.442187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" event={"ID":"4960404c-fe87-49b2-8574-c6f268509899","Type":"ContainerDied","Data":"77a978578c0c8b13e73bbf481b17d6950413a06b2fa1b645983728425088d52d"} Nov 29 01:45:01 crc kubenswrapper[4749]: I1129 01:45:01.442229 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" event={"ID":"4960404c-fe87-49b2-8574-c6f268509899","Type":"ContainerStarted","Data":"1dfb42ce0117e1db9c0754ab5dbb09f82a7727dca605b5a2ed10addc0a06fa33"} Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.855375 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.963695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxmtr\" (UniqueName: \"kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr\") pod \"4960404c-fe87-49b2-8574-c6f268509899\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.963781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume\") pod \"4960404c-fe87-49b2-8574-c6f268509899\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.963917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume\") pod \"4960404c-fe87-49b2-8574-c6f268509899\" (UID: \"4960404c-fe87-49b2-8574-c6f268509899\") " Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.964510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume" (OuterVolumeSpecName: "config-volume") pod "4960404c-fe87-49b2-8574-c6f268509899" (UID: "4960404c-fe87-49b2-8574-c6f268509899"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.971471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4960404c-fe87-49b2-8574-c6f268509899" (UID: "4960404c-fe87-49b2-8574-c6f268509899"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 01:45:02 crc kubenswrapper[4749]: I1129 01:45:02.972444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr" (OuterVolumeSpecName: "kube-api-access-qxmtr") pod "4960404c-fe87-49b2-8574-c6f268509899" (UID: "4960404c-fe87-49b2-8574-c6f268509899"). InnerVolumeSpecName "kube-api-access-qxmtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.066401 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxmtr\" (UniqueName: \"kubernetes.io/projected/4960404c-fe87-49b2-8574-c6f268509899-kube-api-access-qxmtr\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.066470 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4960404c-fe87-49b2-8574-c6f268509899-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.066490 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4960404c-fe87-49b2-8574-c6f268509899-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.465076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" event={"ID":"4960404c-fe87-49b2-8574-c6f268509899","Type":"ContainerDied","Data":"1dfb42ce0117e1db9c0754ab5dbb09f82a7727dca605b5a2ed10addc0a06fa33"} Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.465167 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.465392 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfb42ce0117e1db9c0754ab5dbb09f82a7727dca605b5a2ed10addc0a06fa33" Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.941888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw"] Nov 29 01:45:03 crc kubenswrapper[4749]: I1129 01:45:03.949242 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406300-2xlmw"] Nov 29 01:45:05 crc kubenswrapper[4749]: I1129 01:45:05.089302 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf44185-40f9-4d15-93a0-13604622e06f" path="/var/lib/kubelet/pods/2cf44185-40f9-4d15-93a0-13604622e06f/volumes" Nov 29 01:45:11 crc kubenswrapper[4749]: I1129 01:45:11.853299 4749 scope.go:117] "RemoveContainer" containerID="f22f79437c777e2c389216011056a2d471465b655bdb3c16cfd4cd502d6055c6" Nov 29 01:45:18 crc kubenswrapper[4749]: I1129 01:45:18.984367 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:18 crc kubenswrapper[4749]: E1129 01:45:18.985444 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4960404c-fe87-49b2-8574-c6f268509899" containerName="collect-profiles" Nov 29 01:45:18 crc kubenswrapper[4749]: I1129 01:45:18.985466 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4960404c-fe87-49b2-8574-c6f268509899" containerName="collect-profiles" Nov 29 01:45:18 crc kubenswrapper[4749]: I1129 01:45:18.985746 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4960404c-fe87-49b2-8574-c6f268509899" containerName="collect-profiles" Nov 29 01:45:18 crc kubenswrapper[4749]: I1129 01:45:18.987554 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.021383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.158931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.159045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.159564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4jrl\" (UniqueName: \"kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.261359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4jrl\" (UniqueName: \"kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.261438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.261464 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.263098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.263419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.296901 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t4jrl\" (UniqueName: \"kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl\") pod \"community-operators-xgwpj\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.324797 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.581026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:19 crc kubenswrapper[4749]: I1129 01:45:19.646266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerStarted","Data":"337fbcbd2c8f4e54c4eaa5e7748326d93a31f657d797dd3bd9aa0fdd91fc6c96"} Nov 29 01:45:20 crc kubenswrapper[4749]: I1129 01:45:20.659442 4749 generic.go:334] "Generic (PLEG): container finished" podID="bad68330-6939-48f7-b549-91cd1f650f00" containerID="2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806" exitCode=0 Nov 29 01:45:20 crc kubenswrapper[4749]: I1129 01:45:20.659523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerDied","Data":"2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806"} Nov 29 01:45:20 crc kubenswrapper[4749]: I1129 01:45:20.662159 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 01:45:21 crc kubenswrapper[4749]: I1129 01:45:21.673170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerStarted","Data":"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e"} Nov 29 01:45:22 crc kubenswrapper[4749]: I1129 01:45:22.685574 4749 generic.go:334] "Generic (PLEG): container finished" podID="bad68330-6939-48f7-b549-91cd1f650f00" containerID="90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e" exitCode=0 Nov 29 01:45:22 crc kubenswrapper[4749]: I1129 01:45:22.685704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerDied","Data":"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e"} Nov 29 01:45:23 crc kubenswrapper[4749]: I1129 01:45:23.698441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerStarted","Data":"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1"} Nov 29 01:45:23 crc kubenswrapper[4749]: I1129 01:45:23.721847 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgwpj" podStartSLOduration=3.017421418 podStartE2EDuration="5.721826418s" podCreationTimestamp="2025-11-29 01:45:18 +0000 UTC" firstStartedPulling="2025-11-29 01:45:20.661683566 +0000 UTC m=+2063.833833453" lastFinishedPulling="2025-11-29 01:45:23.366088596 +0000 UTC m=+2066.538238453" observedRunningTime="2025-11-29 01:45:23.714518243 +0000 UTC m=+2066.886668110" watchObservedRunningTime="2025-11-29 
01:45:23.721826418 +0000 UTC m=+2066.893976285" Nov 29 01:45:25 crc kubenswrapper[4749]: I1129 01:45:25.959388 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:25 crc kubenswrapper[4749]: I1129 01:45:25.961661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:25 crc kubenswrapper[4749]: I1129 01:45:25.984964 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.112033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.112570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.112624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqz6g\" (UniqueName: \"kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.214778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.214848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqz6g\" (UniqueName: \"kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.214887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.215281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.215506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.245894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqz6g\" (UniqueName: \"kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g\") pod \"redhat-operators-5tbcf\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.300382 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:26 crc kubenswrapper[4749]: I1129 01:45:26.822112 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:27 crc kubenswrapper[4749]: I1129 01:45:27.738601 4749 generic.go:334] "Generic (PLEG): container finished" podID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerID="2e451e3d7cc7595b316bf7bde2287acdfed459f4e529624bd132559bfff1a9fd" exitCode=0 Nov 29 01:45:27 crc kubenswrapper[4749]: I1129 01:45:27.738681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerDied","Data":"2e451e3d7cc7595b316bf7bde2287acdfed459f4e529624bd132559bfff1a9fd"} Nov 29 01:45:27 crc kubenswrapper[4749]: I1129 01:45:27.739118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerStarted","Data":"89adb600bf42d26b32f88e6cd0a04efac775e108a7ed0217c62392ea60f99c2f"} Nov 29 01:45:28 crc kubenswrapper[4749]: I1129 01:45:28.755950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerStarted","Data":"69bc70e49b5ed3716f96b1d221833ab3e0da59f4583c5eefa0e85a8e195e856b"} Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.325833 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.325917 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.389134 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.773782 4749 generic.go:334] "Generic (PLEG): container finished" podID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerID="69bc70e49b5ed3716f96b1d221833ab3e0da59f4583c5eefa0e85a8e195e856b" exitCode=0 Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.773997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerDied","Data":"69bc70e49b5ed3716f96b1d221833ab3e0da59f4583c5eefa0e85a8e195e856b"} Nov 29 01:45:29 crc kubenswrapper[4749]: I1129 01:45:29.838126 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:30 crc kubenswrapper[4749]: 
I1129 01:45:30.786562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerStarted","Data":"979d5be90b500217ec4ab7376410029e13d2c7ea601d1a049af7378b05d2720c"} Nov 29 01:45:30 crc kubenswrapper[4749]: I1129 01:45:30.828346 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5tbcf" podStartSLOduration=3.359271433 podStartE2EDuration="5.828326044s" podCreationTimestamp="2025-11-29 01:45:25 +0000 UTC" firstStartedPulling="2025-11-29 01:45:27.74250988 +0000 UTC m=+2070.914659767" lastFinishedPulling="2025-11-29 01:45:30.211564481 +0000 UTC m=+2073.383714378" observedRunningTime="2025-11-29 01:45:30.814413132 +0000 UTC m=+2073.986563069" watchObservedRunningTime="2025-11-29 01:45:30.828326044 +0000 UTC m=+2074.000475911" Nov 29 01:45:31 crc kubenswrapper[4749]: I1129 01:45:31.733888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:31 crc kubenswrapper[4749]: I1129 01:45:31.793847 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xgwpj" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="registry-server" containerID="cri-o://9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1" gracePeriod=2 Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.236791 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.425168 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content\") pod \"bad68330-6939-48f7-b549-91cd1f650f00\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.425622 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities\") pod \"bad68330-6939-48f7-b549-91cd1f650f00\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.425730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4jrl\" (UniqueName: \"kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl\") pod \"bad68330-6939-48f7-b549-91cd1f650f00\" (UID: \"bad68330-6939-48f7-b549-91cd1f650f00\") " Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.427644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities" (OuterVolumeSpecName: "utilities") pod "bad68330-6939-48f7-b549-91cd1f650f00" (UID: "bad68330-6939-48f7-b549-91cd1f650f00"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.432895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl" (OuterVolumeSpecName: "kube-api-access-t4jrl") pod "bad68330-6939-48f7-b549-91cd1f650f00" (UID: "bad68330-6939-48f7-b549-91cd1f650f00"). 
InnerVolumeSpecName "kube-api-access-t4jrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.488618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bad68330-6939-48f7-b549-91cd1f650f00" (UID: "bad68330-6939-48f7-b549-91cd1f650f00"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.527448 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.527485 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bad68330-6939-48f7-b549-91cd1f650f00-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.527499 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4jrl\" (UniqueName: \"kubernetes.io/projected/bad68330-6939-48f7-b549-91cd1f650f00-kube-api-access-t4jrl\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.805837 4749 generic.go:334] "Generic (PLEG): container finished" podID="bad68330-6939-48f7-b549-91cd1f650f00" containerID="9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1" exitCode=0 Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.805897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerDied","Data":"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1"} Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.805938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgwpj" event={"ID":"bad68330-6939-48f7-b549-91cd1f650f00","Type":"ContainerDied","Data":"337fbcbd2c8f4e54c4eaa5e7748326d93a31f657d797dd3bd9aa0fdd91fc6c96"} Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.805950 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xgwpj" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.805969 4749 scope.go:117] "RemoveContainer" containerID="9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.845935 4749 scope.go:117] "RemoveContainer" containerID="90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.849497 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.856893 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xgwpj"] Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.871154 4749 scope.go:117] "RemoveContainer" containerID="2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.900402 4749 scope.go:117] "RemoveContainer" containerID="9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1" Nov 29 01:45:32 crc kubenswrapper[4749]: E1129 01:45:32.901002 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1\": container with ID starting with 9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1 not found: ID does not exist" containerID="9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.901052 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1"} err="failed to get container status \"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1\": rpc error: code = NotFound desc = could not find container \"9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1\": container with ID starting with 9867d83014582e6528de2a00e3d66cdc5d2bc106f5d61be58f7dc2469eea66d1 not found: ID does not exist" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.901090 4749 scope.go:117] "RemoveContainer" containerID="90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e" Nov 29 01:45:32 crc kubenswrapper[4749]: E1129 01:45:32.901658 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e\": container with ID starting with 90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e not found: ID does not exist" containerID="90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.901695 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e"} err="failed to get container status \"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e\": rpc error: code = NotFound desc = could not find container \"90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e\": container with ID starting with 90d21d27e4a99cfc44384af10c8b0f0416781f7f15abf71cbab11686d7408d9e not found: ID does not exist" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.901721 4749 scope.go:117] "RemoveContainer" 
containerID="2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806" Nov 29 01:45:32 crc kubenswrapper[4749]: E1129 01:45:32.902221 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806\": container with ID starting with 2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806 not found: ID does not exist" containerID="2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806" Nov 29 01:45:32 crc kubenswrapper[4749]: I1129 01:45:32.902265 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806"} err="failed to get container status \"2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806\": rpc error: code = NotFound desc = could not find container \"2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806\": container with ID starting with 2f0a55c0d8ee0f2be8946bddcaa723d9ac4a6b02d9770d96ffb0e486cc708806 not found: ID does not exist" Nov 29 01:45:33 crc kubenswrapper[4749]: I1129 01:45:33.089616 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad68330-6939-48f7-b549-91cd1f650f00" path="/var/lib/kubelet/pods/bad68330-6939-48f7-b549-91cd1f650f00/volumes" Nov 29 01:45:36 crc kubenswrapper[4749]: I1129 01:45:36.300632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:36 crc kubenswrapper[4749]: I1129 01:45:36.301467 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:37 crc kubenswrapper[4749]: I1129 01:45:37.356991 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5tbcf" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="registry-server" probeResult="failure" output=< Nov 29 01:45:37 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:45:37 crc kubenswrapper[4749]: > Nov 29 01:45:46 crc kubenswrapper[4749]: I1129 01:45:46.377965 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:46 crc kubenswrapper[4749]: I1129 01:45:46.444461 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:46 crc kubenswrapper[4749]: I1129 01:45:46.640725 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:47 crc kubenswrapper[4749]: I1129 01:45:47.948725 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5tbcf" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="registry-server" containerID="cri-o://979d5be90b500217ec4ab7376410029e13d2c7ea601d1a049af7378b05d2720c" gracePeriod=2 Nov 29 01:45:49 crc kubenswrapper[4749]: I1129 01:45:49.975809 4749 generic.go:334] "Generic (PLEG): container finished" podID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerID="979d5be90b500217ec4ab7376410029e13d2c7ea601d1a049af7378b05d2720c" exitCode=0 Nov 29 01:45:49 crc kubenswrapper[4749]: I1129 01:45:49.975937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" 
event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerDied","Data":"979d5be90b500217ec4ab7376410029e13d2c7ea601d1a049af7378b05d2720c"} Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.233521 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.314731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqz6g\" (UniqueName: \"kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g\") pod \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.314830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content\") pod \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.315072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities\") pod \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\" (UID: \"92a7a4eb-eec2-4a51-9040-0a8133510cf6\") " Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.316586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities" (OuterVolumeSpecName: "utilities") pod "92a7a4eb-eec2-4a51-9040-0a8133510cf6" (UID: "92a7a4eb-eec2-4a51-9040-0a8133510cf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.321872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g" (OuterVolumeSpecName: "kube-api-access-bqz6g") pod "92a7a4eb-eec2-4a51-9040-0a8133510cf6" (UID: "92a7a4eb-eec2-4a51-9040-0a8133510cf6"). InnerVolumeSpecName "kube-api-access-bqz6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.416943 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.416998 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqz6g\" (UniqueName: \"kubernetes.io/projected/92a7a4eb-eec2-4a51-9040-0a8133510cf6-kube-api-access-bqz6g\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.477832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92a7a4eb-eec2-4a51-9040-0a8133510cf6" (UID: "92a7a4eb-eec2-4a51-9040-0a8133510cf6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.518125 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a7a4eb-eec2-4a51-9040-0a8133510cf6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.991686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5tbcf" event={"ID":"92a7a4eb-eec2-4a51-9040-0a8133510cf6","Type":"ContainerDied","Data":"89adb600bf42d26b32f88e6cd0a04efac775e108a7ed0217c62392ea60f99c2f"} Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.991737 4749 scope.go:117] "RemoveContainer" containerID="979d5be90b500217ec4ab7376410029e13d2c7ea601d1a049af7378b05d2720c" Nov 29 01:45:50 crc kubenswrapper[4749]: I1129 01:45:50.991765 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5tbcf" Nov 29 01:45:51 crc kubenswrapper[4749]: I1129 01:45:51.052385 4749 scope.go:117] "RemoveContainer" containerID="69bc70e49b5ed3716f96b1d221833ab3e0da59f4583c5eefa0e85a8e195e856b" Nov 29 01:45:51 crc kubenswrapper[4749]: I1129 01:45:51.058080 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:51 crc kubenswrapper[4749]: I1129 01:45:51.088983 4749 scope.go:117] "RemoveContainer" containerID="2e451e3d7cc7595b316bf7bde2287acdfed459f4e529624bd132559bfff1a9fd" Nov 29 01:45:51 crc kubenswrapper[4749]: I1129 01:45:51.097358 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5tbcf"] Nov 29 01:45:53 crc kubenswrapper[4749]: I1129 01:45:53.093081 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" path="/var/lib/kubelet/pods/92a7a4eb-eec2-4a51-9040-0a8133510cf6/volumes" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.897940 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.899131 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="extract-utilities" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.899153 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="extract-utilities" Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.899222 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="extract-content" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.899237 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="extract-content" Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.899255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="extract-utilities" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.899268 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="extract-utilities" Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.899295 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="registry-server" Nov 29 
01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.899309 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="registry-server" Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.899341 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="registry-server" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.902258 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="registry-server" Nov 29 01:46:21 crc kubenswrapper[4749]: E1129 01:46:21.902346 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="extract-content" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.902362 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="extract-content" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.902688 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad68330-6939-48f7-b549-91cd1f650f00" containerName="registry-server" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.902726 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a7a4eb-eec2-4a51-9040-0a8133510cf6" containerName="registry-server" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.904006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.921739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.934930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h624m\" (UniqueName: \"kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.935389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:21 crc kubenswrapper[4749]: I1129 01:46:21.936400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.037987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.038041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.038162 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h624m\" (UniqueName: \"kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.038717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.038737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.065943 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h624m\" (UniqueName: \"kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m\") pod \"redhat-marketplace-fqhsd\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.233800 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:22 crc kubenswrapper[4749]: I1129 01:46:22.680827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:23 crc kubenswrapper[4749]: I1129 01:46:23.323949 4749 generic.go:334] "Generic (PLEG): container finished" podID="860e4612-466d-45de-87f9-4c89d00cf663" containerID="437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150" exitCode=0 Nov 29 01:46:23 crc kubenswrapper[4749]: I1129 01:46:23.324051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerDied","Data":"437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150"} Nov 29 01:46:23 crc kubenswrapper[4749]: I1129 01:46:23.324377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerStarted","Data":"9007e8f0fe7a511c174ec9f3f27672d23314e6340e08b1b4e33772ff17cb8e88"} Nov 29 01:46:24 crc kubenswrapper[4749]: I1129 01:46:24.333988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerStarted","Data":"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48"} Nov 29 01:46:25 crc kubenswrapper[4749]: I1129 01:46:25.347311 4749 generic.go:334] "Generic (PLEG): container finished" podID="860e4612-466d-45de-87f9-4c89d00cf663" containerID="798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48" exitCode=0 Nov 29 01:46:25 crc kubenswrapper[4749]: I1129 01:46:25.347381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerDied","Data":"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48"} Nov 29 01:46:26 crc kubenswrapper[4749]: I1129 01:46:26.359626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerStarted","Data":"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7"} Nov 29 01:46:26 crc kubenswrapper[4749]: I1129 01:46:26.383990 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fqhsd" podStartSLOduration=2.9450634129999997 podStartE2EDuration="5.383972877s" podCreationTimestamp="2025-11-29 01:46:21 +0000 UTC" firstStartedPulling="2025-11-29 01:46:23.325586625 +0000 UTC m=+2126.497736492" lastFinishedPulling="2025-11-29 01:46:25.764496099 +0000 UTC m=+2128.936645956" observedRunningTime="2025-11-29 01:46:26.377875111 +0000 UTC m=+2129.550024978" watchObservedRunningTime="2025-11-29 01:46:26.383972877 +0000 UTC m=+2129.556122764" Nov 29 01:46:32 crc kubenswrapper[4749]: I1129 01:46:32.234823 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:32 crc kubenswrapper[4749]: I1129 01:46:32.235123 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:32 crc kubenswrapper[4749]: I1129 01:46:32.316631 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:32 crc kubenswrapper[4749]: I1129 01:46:32.485882 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:32 crc kubenswrapper[4749]: I1129 01:46:32.565389 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:34 crc kubenswrapper[4749]: I1129 01:46:34.431399 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fqhsd" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="registry-server" containerID="cri-o://b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7" gracePeriod=2 Nov 29 01:46:34 crc kubenswrapper[4749]: I1129 01:46:34.981537 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.145074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities\") pod \"860e4612-466d-45de-87f9-4c89d00cf663\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.145171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content\") pod \"860e4612-466d-45de-87f9-4c89d00cf663\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.145240 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h624m\" (UniqueName: \"kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m\") pod \"860e4612-466d-45de-87f9-4c89d00cf663\" (UID: \"860e4612-466d-45de-87f9-4c89d00cf663\") " Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.147349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities" (OuterVolumeSpecName: "utilities") pod "860e4612-466d-45de-87f9-4c89d00cf663" (UID: "860e4612-466d-45de-87f9-4c89d00cf663"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.154399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m" (OuterVolumeSpecName: "kube-api-access-h624m") pod "860e4612-466d-45de-87f9-4c89d00cf663" (UID: "860e4612-466d-45de-87f9-4c89d00cf663"). InnerVolumeSpecName "kube-api-access-h624m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.171467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "860e4612-466d-45de-87f9-4c89d00cf663" (UID: "860e4612-466d-45de-87f9-4c89d00cf663"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.246576 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.246617 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/860e4612-466d-45de-87f9-4c89d00cf663-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.246637 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h624m\" (UniqueName: \"kubernetes.io/projected/860e4612-466d-45de-87f9-4c89d00cf663-kube-api-access-h624m\") on node \"crc\" DevicePath \"\"" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.443194 4749 generic.go:334] "Generic (PLEG): container finished" podID="860e4612-466d-45de-87f9-4c89d00cf663" containerID="b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7" exitCode=0 Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.443306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerDied","Data":"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7"} Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.443345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqhsd" event={"ID":"860e4612-466d-45de-87f9-4c89d00cf663","Type":"ContainerDied","Data":"9007e8f0fe7a511c174ec9f3f27672d23314e6340e08b1b4e33772ff17cb8e88"} Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.443377 4749 scope.go:117] "RemoveContainer" containerID="b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.443429 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqhsd" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.472434 4749 scope.go:117] "RemoveContainer" containerID="798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.513266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.523785 4749 scope.go:117] "RemoveContainer" containerID="437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.525924 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqhsd"] Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.556149 4749 scope.go:117] "RemoveContainer" containerID="b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7" Nov 29 01:46:35 crc kubenswrapper[4749]: E1129 01:46:35.561213 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7\": container with ID starting with b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7 not found: ID does not exist" containerID="b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.561245 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7"} err="failed to get container status \"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7\": rpc error: code = NotFound desc = could not find container \"b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7\": container with ID starting with b8a98d134e91aeb9fa858fc6970fab09c7fe30e2a50e94f3fc710db4542819c7 not found: ID does not exist" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.561270 4749 scope.go:117] "RemoveContainer" containerID="798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48" Nov 29 01:46:35 crc kubenswrapper[4749]: E1129 01:46:35.561914 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48\": container with ID starting with 798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48 not found: ID does not exist" containerID="798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.561939 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48"} err="failed to get container status \"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48\": rpc error: code = NotFound desc = could not find container \"798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48\": container with ID starting with 798ff91189a06906c11d7176173736fe907e374cc6974a9b09ea6c978cb45d48 not found: ID does not exist" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.561955 4749 scope.go:117] "RemoveContainer" containerID="437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150" Nov 29 01:46:35 crc kubenswrapper[4749]: E1129 01:46:35.562423 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150\": container with ID starting with 437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150 not found: ID does not exist" containerID="437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150" Nov 29 01:46:35 crc kubenswrapper[4749]: I1129 01:46:35.562451 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150"} err="failed to get container status \"437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150\": rpc error: code = NotFound desc = could not find container \"437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150\": container with ID starting with 437b8edc8d577089586f0a99937b0e8735e1aa34dd17f790c7421336637d4150 not found: ID does not exist" Nov 29 01:46:37 crc kubenswrapper[4749]: I1129 01:46:37.091783 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860e4612-466d-45de-87f9-4c89d00cf663" path="/var/lib/kubelet/pods/860e4612-466d-45de-87f9-4c89d00cf663/volumes" Nov 29 01:46:55 crc kubenswrapper[4749]: I1129 01:46:55.374044 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:46:55 crc kubenswrapper[4749]: I1129 01:46:55.374993 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:47:25 crc kubenswrapper[4749]: I1129 01:47:25.374289 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:47:25 crc kubenswrapper[4749]: I1129 01:47:25.375442 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:47:55 crc kubenswrapper[4749]: I1129 01:47:55.373945 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:47:55 crc kubenswrapper[4749]: I1129 01:47:55.376872 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:47:55 crc kubenswrapper[4749]: I1129 01:47:55.377256 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:47:55 crc kubenswrapper[4749]: I1129 01:47:55.378231 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:47:55 crc kubenswrapper[4749]: I1129 01:47:55.378508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" gracePeriod=600 Nov 29 01:47:55 crc kubenswrapper[4749]: E1129 01:47:55.516088 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:47:56 crc kubenswrapper[4749]: I1129 01:47:56.231648 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" exitCode=0 Nov 29 01:47:56 crc kubenswrapper[4749]: I1129 01:47:56.231760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89"} Nov 29 01:47:56 crc kubenswrapper[4749]: I1129 01:47:56.232026 4749 scope.go:117] "RemoveContainer" containerID="08ef43bb07b3609728cc9d9990db3acd5c68f06e8ff6480babb6cfffdb449dc5" Nov 29 01:47:56 crc kubenswrapper[4749]: I1129 01:47:56.232897 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:47:56 crc kubenswrapper[4749]: E1129 01:47:56.233389 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:48:08 crc kubenswrapper[4749]: I1129 01:48:08.074653 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:48:08 crc kubenswrapper[4749]: E1129 01:48:08.075261 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:48:23 crc 
kubenswrapper[4749]: I1129 01:48:23.075675 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:48:23 crc kubenswrapper[4749]: E1129 01:48:23.076435 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:48:35 crc kubenswrapper[4749]: I1129 01:48:35.075454 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:48:35 crc kubenswrapper[4749]: E1129 01:48:35.076320 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:48:50 crc kubenswrapper[4749]: I1129 01:48:50.075058 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:48:50 crc kubenswrapper[4749]: E1129 01:48:50.076005 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:49:02 crc kubenswrapper[4749]: I1129 01:49:02.075256 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:49:02 crc kubenswrapper[4749]: E1129 01:49:02.076534 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:49:16 crc kubenswrapper[4749]: I1129 01:49:16.075559 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:49:16 crc kubenswrapper[4749]: E1129 01:49:16.077909 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:49:27 crc kubenswrapper[4749]: I1129 01:49:27.083042 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:49:27 crc 
kubenswrapper[4749]: E1129 01:49:27.084113 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:49:41 crc kubenswrapper[4749]: I1129 01:49:41.076017 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:49:41 crc kubenswrapper[4749]: E1129 01:49:41.076999 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:49:54 crc kubenswrapper[4749]: I1129 01:49:54.075590 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:49:54 crc kubenswrapper[4749]: E1129 01:49:54.076486 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:50:05 crc kubenswrapper[4749]: I1129 01:50:05.075624 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:50:05 crc kubenswrapper[4749]: E1129 01:50:05.076578 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.334134 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:08 crc kubenswrapper[4749]: E1129 01:50:08.334647 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="extract-content" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.334669 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="extract-content" Nov 29 01:50:08 crc kubenswrapper[4749]: E1129 01:50:08.334710 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="registry-server" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.334723 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="registry-server" Nov 29 01:50:08 crc kubenswrapper[4749]: E1129 01:50:08.334746 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="extract-utilities" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.334759 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="extract-utilities" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.335023 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="860e4612-466d-45de-87f9-4c89d00cf663" containerName="registry-server" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.337035 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.355716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.363649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwpdv\" (UniqueName: \"kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.363822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.363985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.465468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.465829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.465950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwpdv\" (UniqueName: \"kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.466015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.466430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.494568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwpdv\" (UniqueName: \"kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv\") pod \"certified-operators-7947r\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.671306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:08 crc kubenswrapper[4749]: I1129 01:50:08.948658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:09 crc kubenswrapper[4749]: I1129 01:50:09.425261 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerID="9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886" exitCode=0 Nov 29 01:50:09 crc kubenswrapper[4749]: I1129 01:50:09.425335 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerDied","Data":"9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886"} Nov 29 01:50:09 crc kubenswrapper[4749]: I1129 01:50:09.425377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerStarted","Data":"5b9c442d0f5a86fa735c0150662484d1d4aaa1bab8377db77d89b0d37b5be1ac"} Nov 29 01:50:10 crc kubenswrapper[4749]: I1129 01:50:10.434235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerStarted","Data":"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d"} Nov 29 01:50:10 crc kubenswrapper[4749]: E1129 01:50:10.734161 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce8f4990_5d2b_4029_8bc3_b15c153c79cc.slice/crio-62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d.scope\": RecentStats: unable to find data in memory cache]" Nov 29 01:50:11 crc kubenswrapper[4749]: I1129 01:50:11.448705 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerID="62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d" exitCode=0 Nov 29 01:50:11 crc kubenswrapper[4749]: I1129 01:50:11.448860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" 
event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerDied","Data":"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d"} Nov 29 01:50:12 crc kubenswrapper[4749]: I1129 01:50:12.463884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerStarted","Data":"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4"} Nov 29 01:50:12 crc kubenswrapper[4749]: I1129 01:50:12.497813 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7947r" podStartSLOduration=2.015079526 podStartE2EDuration="4.49778785s" podCreationTimestamp="2025-11-29 01:50:08 +0000 UTC" firstStartedPulling="2025-11-29 01:50:09.427944399 +0000 UTC m=+2352.600094286" lastFinishedPulling="2025-11-29 01:50:11.910652713 +0000 UTC m=+2355.082802610" observedRunningTime="2025-11-29 01:50:12.490061882 +0000 UTC m=+2355.662211779" watchObservedRunningTime="2025-11-29 01:50:12.49778785 +0000 UTC m=+2355.669937737" Nov 29 01:50:16 crc kubenswrapper[4749]: I1129 01:50:16.076375 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:50:16 crc kubenswrapper[4749]: E1129 01:50:16.076652 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:50:18 crc kubenswrapper[4749]: I1129 01:50:18.672472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:18 crc kubenswrapper[4749]: I1129 01:50:18.672867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:18 crc kubenswrapper[4749]: I1129 01:50:18.726847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:19 crc kubenswrapper[4749]: I1129 01:50:19.579744 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:19 crc kubenswrapper[4749]: I1129 01:50:19.647220 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:21 crc kubenswrapper[4749]: I1129 01:50:21.562623 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7947r" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="registry-server" containerID="cri-o://f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4" gracePeriod=2 Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.503968 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.569890 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerID="f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4" exitCode=0 Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.569942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerDied","Data":"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4"} Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.569956 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7947r" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.569978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7947r" event={"ID":"ce8f4990-5d2b-4029-8bc3-b15c153c79cc","Type":"ContainerDied","Data":"5b9c442d0f5a86fa735c0150662484d1d4aaa1bab8377db77d89b0d37b5be1ac"} Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.570018 4749 scope.go:117] "RemoveContainer" containerID="f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.591586 4749 scope.go:117] "RemoveContainer" containerID="62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.613520 4749 scope.go:117] "RemoveContainer" containerID="9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.630894 4749 scope.go:117] "RemoveContainer" containerID="f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.631236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities\") pod \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.631281 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwpdv\" (UniqueName: \"kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv\") pod \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.631425 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content\") pod \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\" (UID: \"ce8f4990-5d2b-4029-8bc3-b15c153c79cc\") " Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.632509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities" (OuterVolumeSpecName: "utilities") pod "ce8f4990-5d2b-4029-8bc3-b15c153c79cc" (UID: "ce8f4990-5d2b-4029-8bc3-b15c153c79cc"). InnerVolumeSpecName "utilities". 
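A note on the "Observed pod startup duration" record at 01:50:12 above: its durations are internally consistent. podStartE2EDuration is observedRunningTime minus podCreationTimestamp (01:50:12.49778785 - 01:50:08 = 4.49778785s), and podStartSLOduration is that same span minus the image-pull window bounded by firstStartedPulling and lastFinishedPulling, read off the monotonic m=+ offsets. A quick check in Go, using only numbers copied from the log:

    package main

    import "fmt"

    // Figures from the certified-operators-7947r startup-latency entry.
    func main() {
        const (
            e2e              = 4.49778785     // podStartE2EDuration, seconds
            firstStartedPull = 2352.600094286 // firstStartedPulling, m=+ seconds
            lastFinishedPull = 2355.082802610 // lastFinishedPulling, m=+ seconds
        )
        pull := lastFinishedPull - firstStartedPull // 2.482708324s spent pulling images
        fmt.Printf("podStartSLOduration = %.9fs\n", e2e-pull) // 2.015079526, as logged
    }

The same decomposition holds for the redhat-operators-2d24f, community-operators-9m7j8, and redhat-marketplace-cws5d startup records later in this section.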
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:50:22 crc kubenswrapper[4749]: E1129 01:50:22.633019 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4\": container with ID starting with f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4 not found: ID does not exist" containerID="f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.633046 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4"} err="failed to get container status \"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4\": rpc error: code = NotFound desc = could not find container \"f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4\": container with ID starting with f79e90a86c07881950168bb14cb600746bc1a37d08d78f07e84077f908a08fa4 not found: ID does not exist" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.633104 4749 scope.go:117] "RemoveContainer" containerID="62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d" Nov 29 01:50:22 crc kubenswrapper[4749]: E1129 01:50:22.633539 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d\": container with ID starting with 62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d not found: ID does not exist" containerID="62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.633562 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d"} err="failed to get container status \"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d\": rpc error: code = NotFound desc = could not find container \"62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d\": container with ID starting with 62f8572efb1e0098be1391287a218e791679e6d69016dd8ea86eedff0e57282d not found: ID does not exist" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.633578 4749 scope.go:117] "RemoveContainer" containerID="9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886" Nov 29 01:50:22 crc kubenswrapper[4749]: E1129 01:50:22.633807 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886\": container with ID starting with 9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886 not found: ID does not exist" containerID="9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.633828 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886"} err="failed to get container status \"9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886\": rpc error: code = NotFound desc = could not find container \"9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886\": container with ID starting with 
9d54a37683161dc6508a1f276893627bd692379527dd4fd84e569ec951373886 not found: ID does not exist" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.639278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv" (OuterVolumeSpecName: "kube-api-access-xwpdv") pod "ce8f4990-5d2b-4029-8bc3-b15c153c79cc" (UID: "ce8f4990-5d2b-4029-8bc3-b15c153c79cc"). InnerVolumeSpecName "kube-api-access-xwpdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.684618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce8f4990-5d2b-4029-8bc3-b15c153c79cc" (UID: "ce8f4990-5d2b-4029-8bc3-b15c153c79cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.733069 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.733119 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.733657 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwpdv\" (UniqueName: \"kubernetes.io/projected/ce8f4990-5d2b-4029-8bc3-b15c153c79cc-kube-api-access-xwpdv\") on node \"crc\" DevicePath \"\"" Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.935528 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:22 crc kubenswrapper[4749]: I1129 01:50:22.945739 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7947r"] Nov 29 01:50:23 crc kubenswrapper[4749]: I1129 01:50:23.088781 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" path="/var/lib/kubelet/pods/ce8f4990-5d2b-4029-8bc3-b15c153c79cc/volumes" Nov 29 01:50:28 crc kubenswrapper[4749]: I1129 01:50:28.074716 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:50:28 crc kubenswrapper[4749]: E1129 01:50:28.075403 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:50:42 crc kubenswrapper[4749]: I1129 01:50:42.075371 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:50:42 crc kubenswrapper[4749]: E1129 01:50:42.076303 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:50:57 crc kubenswrapper[4749]: I1129 01:50:57.083868 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:50:57 crc kubenswrapper[4749]: E1129 01:50:57.085896 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:51:12 crc kubenswrapper[4749]: I1129 01:51:12.074902 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:51:12 crc kubenswrapper[4749]: E1129 01:51:12.075938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:51:26 crc kubenswrapper[4749]: I1129 01:51:26.076090 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:51:26 crc kubenswrapper[4749]: E1129 01:51:26.077102 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:51:38 crc kubenswrapper[4749]: I1129 01:51:38.074876 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:51:38 crc kubenswrapper[4749]: E1129 01:51:38.075690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:51:49 crc kubenswrapper[4749]: I1129 01:51:49.075761 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:51:49 crc kubenswrapper[4749]: E1129 01:51:49.076697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:52:03 crc kubenswrapper[4749]: I1129 01:52:03.075702 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:52:03 crc kubenswrapper[4749]: E1129 01:52:03.076586 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:52:16 crc kubenswrapper[4749]: I1129 01:52:16.075557 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:52:16 crc kubenswrapper[4749]: E1129 01:52:16.076647 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:52:31 crc kubenswrapper[4749]: I1129 01:52:31.075676 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:52:31 crc kubenswrapper[4749]: E1129 01:52:31.076587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:52:46 crc kubenswrapper[4749]: I1129 01:52:46.075095 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:52:46 crc kubenswrapper[4749]: E1129 01:52:46.076384 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 01:52:59 crc kubenswrapper[4749]: I1129 01:52:59.075501 4749 scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:53:00 crc kubenswrapper[4749]: I1129 01:53:00.086875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112"} Nov 29 01:55:25 crc kubenswrapper[4749]: I1129 01:55:25.373980 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:55:25 crc kubenswrapper[4749]: I1129 01:55:25.376495 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:55:55 crc kubenswrapper[4749]: I1129 01:55:55.374617 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:55:55 crc kubenswrapper[4749]: I1129 01:55:55.376973 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.012182 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:07 crc kubenswrapper[4749]: E1129 01:56:07.013580 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="extract-utilities" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.013611 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="extract-utilities" Nov 29 01:56:07 crc kubenswrapper[4749]: E1129 01:56:07.013666 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="extract-content" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.013687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="extract-content" Nov 29 01:56:07 crc kubenswrapper[4749]: E1129 01:56:07.013732 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="registry-server" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.013750 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="registry-server" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.014128 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8f4990-5d2b-4029-8bc3-b15c153c79cc" containerName="registry-server" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.017547 4749 util.go:30] "No sandbox for pod can be found. 
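For orientation: the machine-config-daemon errors above, repeating from 01:50:16 through 01:52:46, show one container sitting at the top of its restart back-off, and the retry finally lands at 01:52:59 with the ContainerStarted event at 01:53:00. The "back-off 5m0s" figure matches the kubelet's commonly documented CrashLoopBackOff schedule, which this log does not print and is assumed here: an initial 10s delay, doubled after each failed restart, capped at five minutes. A sketch of that schedule:

    package main

    import (
        "fmt"
        "time"
    )

    // Assumed CrashLoopBackOff schedule: 10s initial delay, doubling per
    // failed restart, capped at 5m (the "back-off 5m0s" seen above).
    func main() {
        delay := 10 * time.Second
        const maxDelay = 5 * time.Minute
        for restart := 1; restart <= 7; restart++ {
            fmt.Printf("restart %d: wait %v\n", restart, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

Under that schedule the ceiling is reached after the sixth consecutive crash; while the timer runs, every pod sync attempt fails fast, which is why the same "Error syncing pod, skipping" line repeats every ten to fifteen seconds above.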
Need to start a new one" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.024791 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.174984 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4m8c\" (UniqueName: \"kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.175378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.175422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.276631 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.276719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.276839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4m8c\" (UniqueName: \"kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.277399 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.277393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.316419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r4m8c\" (UniqueName: \"kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c\") pod \"redhat-operators-2d24f\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.353742 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.619814 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.805163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerStarted","Data":"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c"} Nov 29 01:56:07 crc kubenswrapper[4749]: I1129 01:56:07.805232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerStarted","Data":"d1e37e4bacd2d9b2715614b2ed7aa667645b4582d887306753f62236433fc131"} Nov 29 01:56:08 crc kubenswrapper[4749]: I1129 01:56:08.816518 4749 generic.go:334] "Generic (PLEG): container finished" podID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerID="97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c" exitCode=0 Nov 29 01:56:08 crc kubenswrapper[4749]: I1129 01:56:08.816569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerDied","Data":"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c"} Nov 29 01:56:08 crc kubenswrapper[4749]: I1129 01:56:08.819305 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 01:56:10 crc kubenswrapper[4749]: I1129 01:56:10.841786 4749 generic.go:334] "Generic (PLEG): container finished" podID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerID="db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6" exitCode=0 Nov 29 01:56:10 crc kubenswrapper[4749]: I1129 01:56:10.841868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerDied","Data":"db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6"} Nov 29 01:56:11 crc kubenswrapper[4749]: I1129 01:56:11.854905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerStarted","Data":"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0"} Nov 29 01:56:11 crc kubenswrapper[4749]: I1129 01:56:11.885673 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2d24f" podStartSLOduration=3.437688097 podStartE2EDuration="5.885652368s" podCreationTimestamp="2025-11-29 01:56:06 +0000 UTC" firstStartedPulling="2025-11-29 01:56:08.818884705 +0000 UTC m=+2711.991034602" lastFinishedPulling="2025-11-29 01:56:11.266849016 +0000 UTC m=+2714.438998873" observedRunningTime="2025-11-29 01:56:11.882674076 +0000 UTC m=+2715.054823933" watchObservedRunningTime="2025-11-29 01:56:11.885652368 +0000 UTC m=+2715.057802245" Nov 29 01:56:17 crc 
kubenswrapper[4749]: I1129 01:56:17.355693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:17 crc kubenswrapper[4749]: I1129 01:56:17.356038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:18 crc kubenswrapper[4749]: I1129 01:56:18.436965 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2d24f" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="registry-server" probeResult="failure" output=< Nov 29 01:56:18 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 01:56:18 crc kubenswrapper[4749]: > Nov 29 01:56:25 crc kubenswrapper[4749]: I1129 01:56:25.374664 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 01:56:25 crc kubenswrapper[4749]: I1129 01:56:25.375233 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 01:56:25 crc kubenswrapper[4749]: I1129 01:56:25.375290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 01:56:25 crc kubenswrapper[4749]: I1129 01:56:25.375989 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 01:56:25 crc kubenswrapper[4749]: I1129 01:56:25.376062 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112" gracePeriod=600 Nov 29 01:56:26 crc kubenswrapper[4749]: I1129 01:56:26.983585 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112" exitCode=0 Nov 29 01:56:26 crc kubenswrapper[4749]: I1129 01:56:26.983662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112"} Nov 29 01:56:26 crc kubenswrapper[4749]: I1129 01:56:26.984226 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"} Nov 29 01:56:26 crc kubenswrapper[4749]: I1129 01:56:26.984250 4749 
scope.go:117] "RemoveContainer" containerID="f600eee0674c2766afd849d98f727f2598dd1ffbe100f34d6051207092d1cf89" Nov 29 01:56:27 crc kubenswrapper[4749]: I1129 01:56:27.410254 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:27 crc kubenswrapper[4749]: I1129 01:56:27.486697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:27 crc kubenswrapper[4749]: I1129 01:56:27.696511 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:28 crc kubenswrapper[4749]: I1129 01:56:28.999087 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2d24f" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="registry-server" containerID="cri-o://4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0" gracePeriod=2 Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.471455 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.648652 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content\") pod \"4918a3f2-e8cb-46c6-8df4-9040881aebef\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.648827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities\") pod \"4918a3f2-e8cb-46c6-8df4-9040881aebef\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.648922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4m8c\" (UniqueName: \"kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c\") pod \"4918a3f2-e8cb-46c6-8df4-9040881aebef\" (UID: \"4918a3f2-e8cb-46c6-8df4-9040881aebef\") " Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.650498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities" (OuterVolumeSpecName: "utilities") pod "4918a3f2-e8cb-46c6-8df4-9040881aebef" (UID: "4918a3f2-e8cb-46c6-8df4-9040881aebef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.660659 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c" (OuterVolumeSpecName: "kube-api-access-r4m8c") pod "4918a3f2-e8cb-46c6-8df4-9040881aebef" (UID: "4918a3f2-e8cb-46c6-8df4-9040881aebef"). InnerVolumeSpecName "kube-api-access-r4m8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.750601 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.750645 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4m8c\" (UniqueName: \"kubernetes.io/projected/4918a3f2-e8cb-46c6-8df4-9040881aebef-kube-api-access-r4m8c\") on node \"crc\" DevicePath \"\"" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.794959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4918a3f2-e8cb-46c6-8df4-9040881aebef" (UID: "4918a3f2-e8cb-46c6-8df4-9040881aebef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 01:56:29 crc kubenswrapper[4749]: I1129 01:56:29.852292 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4918a3f2-e8cb-46c6-8df4-9040881aebef-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.009945 4749 generic.go:334] "Generic (PLEG): container finished" podID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerID="4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0" exitCode=0 Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.010032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerDied","Data":"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0"} Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.010111 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d24f" event={"ID":"4918a3f2-e8cb-46c6-8df4-9040881aebef","Type":"ContainerDied","Data":"d1e37e4bacd2d9b2715614b2ed7aa667645b4582d887306753f62236433fc131"} Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.010119 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2d24f" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.010146 4749 scope.go:117] "RemoveContainer" containerID="4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.046061 4749 scope.go:117] "RemoveContainer" containerID="db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.066545 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.074970 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2d24f"] Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.095677 4749 scope.go:117] "RemoveContainer" containerID="97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.125953 4749 scope.go:117] "RemoveContainer" containerID="4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0" Nov 29 01:56:30 crc kubenswrapper[4749]: E1129 01:56:30.126746 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0\": container with ID starting with 4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0 not found: ID does not exist" containerID="4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.126820 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0"} err="failed to get container status \"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0\": rpc error: code = NotFound desc = could not find container \"4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0\": container with ID starting with 4d08646b3b0d204c21323616aa67895fea87dddd7baaf0b72eebeb02add061c0 not found: ID does not exist" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.126869 4749 scope.go:117] "RemoveContainer" containerID="db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6" Nov 29 01:56:30 crc kubenswrapper[4749]: E1129 01:56:30.127552 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6\": container with ID starting with db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6 not found: ID does not exist" containerID="db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.127624 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6"} err="failed to get container status \"db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6\": rpc error: code = NotFound desc = could not find container \"db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6\": container with ID starting with db475d2bbb5525d1b008bb3e885640569380b76784c39f4540519763aae327b6 not found: ID does not exist" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.127668 4749 scope.go:117] "RemoveContainer" 
containerID="97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c" Nov 29 01:56:30 crc kubenswrapper[4749]: E1129 01:56:30.128108 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c\": container with ID starting with 97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c not found: ID does not exist" containerID="97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c" Nov 29 01:56:30 crc kubenswrapper[4749]: I1129 01:56:30.128167 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c"} err="failed to get container status \"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c\": rpc error: code = NotFound desc = could not find container \"97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c\": container with ID starting with 97161dead18c9d2628e0a115f1eb382caeb1d42220b563a5caebc13ab7bd3e0c not found: ID does not exist" Nov 29 01:56:31 crc kubenswrapper[4749]: I1129 01:56:31.091082 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" path="/var/lib/kubelet/pods/4918a3f2-e8cb-46c6-8df4-9040881aebef/volumes" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.387174 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9m7j8"] Nov 29 01:57:22 crc kubenswrapper[4749]: E1129 01:57:22.390178 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="registry-server" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.390423 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="registry-server" Nov 29 01:57:22 crc kubenswrapper[4749]: E1129 01:57:22.390580 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="extract-utilities" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.390746 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="extract-utilities" Nov 29 01:57:22 crc kubenswrapper[4749]: E1129 01:57:22.390969 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="extract-content" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.391150 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="extract-content" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.391719 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4918a3f2-e8cb-46c6-8df4-9040881aebef" containerName="registry-server" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.396662 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.397268 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9m7j8"] Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.528327 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.528670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m97d\" (UniqueName: \"kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.528697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.630233 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m97d\" (UniqueName: \"kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.630289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.630358 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.630856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.631083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.667025 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2m97d\" (UniqueName: \"kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d\") pod \"community-operators-9m7j8\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") " pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:22 crc kubenswrapper[4749]: I1129 01:57:22.767270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9m7j8" Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.220232 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9m7j8"] Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.517516 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerID="7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96" exitCode=0 Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.517584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerDied","Data":"7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96"} Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.517626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerStarted","Data":"b310b8d77362e293f21642ded960d9c053350364987ded54c54d5922f5b3e0f7"} Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.789481 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"] Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.793176 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.820255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"] Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.955022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk2sw\" (UniqueName: \"kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.955461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:23 crc kubenswrapper[4749]: I1129 01:57:23.955610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.057116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk2sw\" (UniqueName: \"kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.057611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.057737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.058322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.058369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d" Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.085349 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-nk2sw\" (UniqueName: \"kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw\") pod \"redhat-marketplace-cws5d\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") " pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.117265 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.530814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerStarted","Data":"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"}
Nov 29 01:57:24 crc kubenswrapper[4749]: I1129 01:57:24.604109 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"]
Nov 29 01:57:25 crc kubenswrapper[4749]: I1129 01:57:25.542982 4749 generic.go:334] "Generic (PLEG): container finished" podID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerID="af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7" exitCode=0
Nov 29 01:57:25 crc kubenswrapper[4749]: I1129 01:57:25.543098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerDied","Data":"af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7"}
Nov 29 01:57:25 crc kubenswrapper[4749]: I1129 01:57:25.543139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerStarted","Data":"6ec136be2f5cc9635fb5e1759885d153a328711def9c294c0699317aa5083a03"}
Nov 29 01:57:25 crc kubenswrapper[4749]: I1129 01:57:25.549291 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerID="5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b" exitCode=0
Nov 29 01:57:25 crc kubenswrapper[4749]: I1129 01:57:25.549382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerDied","Data":"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"}
Nov 29 01:57:26 crc kubenswrapper[4749]: I1129 01:57:26.560456 4749 generic.go:334] "Generic (PLEG): container finished" podID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerID="626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b" exitCode=0
Nov 29 01:57:26 crc kubenswrapper[4749]: I1129 01:57:26.560570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerDied","Data":"626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b"}
Nov 29 01:57:26 crc kubenswrapper[4749]: I1129 01:57:26.564128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerStarted","Data":"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"}
Nov 29 01:57:26 crc kubenswrapper[4749]: I1129 01:57:26.613934 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9m7j8" podStartSLOduration=1.967052501 podStartE2EDuration="4.613911851s" podCreationTimestamp="2025-11-29 01:57:22 +0000 UTC" firstStartedPulling="2025-11-29 01:57:23.521251303 +0000 UTC m=+2786.693401190" lastFinishedPulling="2025-11-29 01:57:26.168110673 +0000 UTC m=+2789.340260540" observedRunningTime="2025-11-29 01:57:26.612048376 +0000 UTC m=+2789.784198253" watchObservedRunningTime="2025-11-29 01:57:26.613911851 +0000 UTC m=+2789.786061728"
Nov 29 01:57:27 crc kubenswrapper[4749]: I1129 01:57:27.576823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerStarted","Data":"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"}
Nov 29 01:57:27 crc kubenswrapper[4749]: I1129 01:57:27.606867 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cws5d" podStartSLOduration=2.915467733 podStartE2EDuration="4.606849549s" podCreationTimestamp="2025-11-29 01:57:23 +0000 UTC" firstStartedPulling="2025-11-29 01:57:25.548097336 +0000 UTC m=+2788.720247243" lastFinishedPulling="2025-11-29 01:57:27.239479162 +0000 UTC m=+2790.411629059" observedRunningTime="2025-11-29 01:57:27.600504996 +0000 UTC m=+2790.772654883" watchObservedRunningTime="2025-11-29 01:57:27.606849549 +0000 UTC m=+2790.778999406"
Nov 29 01:57:32 crc kubenswrapper[4749]: I1129 01:57:32.768104 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:32 crc kubenswrapper[4749]: I1129 01:57:32.768949 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:32 crc kubenswrapper[4749]: I1129 01:57:32.845340 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:33 crc kubenswrapper[4749]: I1129 01:57:33.755546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:33 crc kubenswrapper[4749]: I1129 01:57:33.818827 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9m7j8"]
Nov 29 01:57:34 crc kubenswrapper[4749]: I1129 01:57:34.118099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:34 crc kubenswrapper[4749]: I1129 01:57:34.118177 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:34 crc kubenswrapper[4749]: I1129 01:57:34.185246 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:34 crc kubenswrapper[4749]: I1129 01:57:34.737777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:35 crc kubenswrapper[4749]: I1129 01:57:35.500833 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"]
Nov 29 01:57:35 crc kubenswrapper[4749]: I1129 01:57:35.671003 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9m7j8" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="registry-server" containerID="cri-o://7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa" gracePeriod=2
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.671918 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686377 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerID="7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa" exitCode=0
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686461 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9m7j8"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerDied","Data":"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"}
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9m7j8" event={"ID":"7cecfc05-d29e-4174-a9e5-b5dfabe17747","Type":"ContainerDied","Data":"b310b8d77362e293f21642ded960d9c053350364987ded54c54d5922f5b3e0f7"}
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686657 4749 scope.go:117] "RemoveContainer" containerID="7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.686627 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cws5d" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="registry-server" containerID="cri-o://4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784" gracePeriod=2
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.717538 4749 scope.go:117] "RemoveContainer" containerID="5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.744758 4749 scope.go:117] "RemoveContainer" containerID="7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.782040 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m97d\" (UniqueName: \"kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d\") pod \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") "
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.782166 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content\") pod \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") "
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.782222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities\") pod \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\" (UID: \"7cecfc05-d29e-4174-a9e5-b5dfabe17747\") "
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.787304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d" (OuterVolumeSpecName: "kube-api-access-2m97d") pod "7cecfc05-d29e-4174-a9e5-b5dfabe17747" (UID: "7cecfc05-d29e-4174-a9e5-b5dfabe17747"). InnerVolumeSpecName "kube-api-access-2m97d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.791543 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities" (OuterVolumeSpecName: "utilities") pod "7cecfc05-d29e-4174-a9e5-b5dfabe17747" (UID: "7cecfc05-d29e-4174-a9e5-b5dfabe17747"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.839484 4749 scope.go:117] "RemoveContainer" containerID="7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"
Nov 29 01:57:36 crc kubenswrapper[4749]: E1129 01:57:36.845793 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa\": container with ID starting with 7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa not found: ID does not exist" containerID="7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.846059 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa"} err="failed to get container status \"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa\": rpc error: code = NotFound desc = could not find container \"7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa\": container with ID starting with 7f0c46e13f231c5a750f1ea0fa30dc3d8d46deb3b551f81024329a1435b53daa not found: ID does not exist"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.846206 4749 scope.go:117] "RemoveContainer" containerID="5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"
Nov 29 01:57:36 crc kubenswrapper[4749]: E1129 01:57:36.847032 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b\": container with ID starting with 5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b not found: ID does not exist" containerID="5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.847090 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b"} err="failed to get container status \"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b\": rpc error: code = NotFound desc = could not find container \"5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b\": container with ID starting with 5501c8ed93d338cbac72440feac00b61f8e398836de7baff13b285af7e37445b not found: ID does not exist"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.847126 4749 scope.go:117] "RemoveContainer" containerID="7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.848677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cecfc05-d29e-4174-a9e5-b5dfabe17747" (UID: "7cecfc05-d29e-4174-a9e5-b5dfabe17747"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:57:36 crc kubenswrapper[4749]: E1129 01:57:36.850749 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96\": container with ID starting with 7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96 not found: ID does not exist" containerID="7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.850799 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96"} err="failed to get container status \"7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96\": rpc error: code = NotFound desc = could not find container \"7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96\": container with ID starting with 7ad1e3f76f9f6a28d49d6bbf53ec6c5c42f7d6c4929c77026fa2bd6f0d78ef96 not found: ID does not exist"
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.883793 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m97d\" (UniqueName: \"kubernetes.io/projected/7cecfc05-d29e-4174-a9e5-b5dfabe17747-kube-api-access-2m97d\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.883857 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:36 crc kubenswrapper[4749]: I1129 01:57:36.883871 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cecfc05-d29e-4174-a9e5-b5dfabe17747-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.020951 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9m7j8"]
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.026852 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9m7j8"]
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.088146 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.092064 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" path="/var/lib/kubelet/pods/7cecfc05-d29e-4174-a9e5-b5dfabe17747/volumes"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.189398 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities\") pod \"85040bcc-02f5-45d8-9752-abfa6db8af6a\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") "
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.189499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk2sw\" (UniqueName: \"kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw\") pod \"85040bcc-02f5-45d8-9752-abfa6db8af6a\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") "
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.189589 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content\") pod \"85040bcc-02f5-45d8-9752-abfa6db8af6a\" (UID: \"85040bcc-02f5-45d8-9752-abfa6db8af6a\") "
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.190843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities" (OuterVolumeSpecName: "utilities") pod "85040bcc-02f5-45d8-9752-abfa6db8af6a" (UID: "85040bcc-02f5-45d8-9752-abfa6db8af6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.193789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw" (OuterVolumeSpecName: "kube-api-access-nk2sw") pod "85040bcc-02f5-45d8-9752-abfa6db8af6a" (UID: "85040bcc-02f5-45d8-9752-abfa6db8af6a"). InnerVolumeSpecName "kube-api-access-nk2sw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.208474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85040bcc-02f5-45d8-9752-abfa6db8af6a" (UID: "85040bcc-02f5-45d8-9752-abfa6db8af6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.291249 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk2sw\" (UniqueName: \"kubernetes.io/projected/85040bcc-02f5-45d8-9752-abfa6db8af6a-kube-api-access-nk2sw\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.291689 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.291703 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85040bcc-02f5-45d8-9752-abfa6db8af6a-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.701521 4749 generic.go:334] "Generic (PLEG): container finished" podID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerID="4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784" exitCode=0
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.703090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerDied","Data":"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"}
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.703305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cws5d" event={"ID":"85040bcc-02f5-45d8-9752-abfa6db8af6a","Type":"ContainerDied","Data":"6ec136be2f5cc9635fb5e1759885d153a328711def9c294c0699317aa5083a03"}
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.703514 4749 scope.go:117] "RemoveContainer" containerID="4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.704364 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cws5d"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.742730 4749 scope.go:117] "RemoveContainer" containerID="626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.764316 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"]
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.779121 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cws5d"]
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.794713 4749 scope.go:117] "RemoveContainer" containerID="af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.830922 4749 scope.go:117] "RemoveContainer" containerID="4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"
Nov 29 01:57:37 crc kubenswrapper[4749]: E1129 01:57:37.831630 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784\": container with ID starting with 4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784 not found: ID does not exist" containerID="4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.831728 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784"} err="failed to get container status \"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784\": rpc error: code = NotFound desc = could not find container \"4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784\": container with ID starting with 4cddc49b8eeedbdcafd6cbaf79c090e68167e859aa268d24e33eecdf2a308784 not found: ID does not exist"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.831803 4749 scope.go:117] "RemoveContainer" containerID="626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b"
Nov 29 01:57:37 crc kubenswrapper[4749]: E1129 01:57:37.832504 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b\": container with ID starting with 626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b not found: ID does not exist" containerID="626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.832564 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b"} err="failed to get container status \"626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b\": rpc error: code = NotFound desc = could not find container \"626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b\": container with ID starting with 626b8753f0702e50158fe127d54337c95e7b20e092c778c6db1c842e88c6ed7b not found: ID does not exist"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.832603 4749 scope.go:117] "RemoveContainer" containerID="af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7"
Nov 29 01:57:37 crc kubenswrapper[4749]: E1129 01:57:37.833296 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7\": container with ID starting with af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7 not found: ID does not exist" containerID="af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7"
Nov 29 01:57:37 crc kubenswrapper[4749]: I1129 01:57:37.833432 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7"} err="failed to get container status \"af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7\": rpc error: code = NotFound desc = could not find container \"af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7\": container with ID starting with af6addfe1979b068d82ae7db03d578345468eaa4f999b3c4daf26c512021c9c7 not found: ID does not exist"
Nov 29 01:57:39 crc kubenswrapper[4749]: I1129 01:57:39.093353 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" path="/var/lib/kubelet/pods/85040bcc-02f5-45d8-9752-abfa6db8af6a/volumes"
Nov 29 01:58:55 crc kubenswrapper[4749]: I1129 01:58:55.374405 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 01:58:55 crc kubenswrapper[4749]: I1129 01:58:55.375022 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 01:59:25 crc kubenswrapper[4749]: I1129 01:59:25.374362 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 01:59:25 crc kubenswrapper[4749]: I1129 01:59:25.374934 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 01:59:55 crc kubenswrapper[4749]: I1129 01:59:55.374128 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 01:59:55 crc kubenswrapper[4749]: I1129 01:59:55.374790 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 01:59:55 crc kubenswrapper[4749]: I1129 01:59:55.374854 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
Nov 29 01:59:55 crc kubenswrapper[4749]: I1129 01:59:55.375810 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 01:59:55 crc kubenswrapper[4749]: I1129 01:59:55.375907 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562" gracePeriod=600
Nov 29 01:59:55 crc kubenswrapper[4749]: E1129 01:59:55.510648 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 01:59:56 crc kubenswrapper[4749]: I1129 01:59:56.067315 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562" exitCode=0
Nov 29 01:59:56 crc kubenswrapper[4749]: I1129 01:59:56.067383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"}
Nov 29 01:59:56 crc kubenswrapper[4749]: I1129 01:59:56.067432 4749 scope.go:117] "RemoveContainer" containerID="ef519bfd651d7d6b3ad85ddd14a74f376232c09fb83b6e9d86cf00eb0a7b8112"
Nov 29 01:59:56 crc kubenswrapper[4749]: I1129 01:59:56.068299 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 01:59:56 crc kubenswrapper[4749]: E1129 01:59:56.068792 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.175869 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"]
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.185843 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="extract-utilities"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.185871 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="extract-utilities"
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.185890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="extract-content"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.185899 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="extract-content"
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.185926 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="extract-content"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.185934 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="extract-content"
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.185950 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="extract-utilities"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.185959 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="extract-utilities"
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.185980 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.185990 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: E1129 02:00:00.186003 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.186013 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.186249 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cecfc05-d29e-4174-a9e5-b5dfabe17747" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.186272 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85040bcc-02f5-45d8-9752-abfa6db8af6a" containerName="registry-server"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.186807 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"]
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.186912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.189985 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.190851 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.225328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hg4p\" (UniqueName: \"kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.225394 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.225453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.326988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.327987 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.328172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hg4p\" (UniqueName: \"kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.328309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.340415 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.346479 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hg4p\" (UniqueName: \"kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p\") pod \"collect-profiles-29406360-swldl\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.505945 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:00 crc kubenswrapper[4749]: I1129 02:00:00.973543 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"]
Nov 29 02:00:00 crc kubenswrapper[4749]: W1129 02:00:00.984485 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee8d7c5e_51c6_42e3_b2ff_26c596e1ba0f.slice/crio-7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a WatchSource:0}: Error finding container 7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a: Status 404 returned error can't find the container with id 7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a
Nov 29 02:00:01 crc kubenswrapper[4749]: I1129 02:00:01.134232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl" event={"ID":"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f","Type":"ContainerStarted","Data":"7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a"}
Nov 29 02:00:02 crc kubenswrapper[4749]: I1129 02:00:02.141701 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" containerID="13da0910bcff0cef3358d8a28c1988d32e43c4c9bb53271d26bab119f3d2e2d9" exitCode=0
Nov 29 02:00:02 crc kubenswrapper[4749]: I1129 02:00:02.141759 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl" event={"ID":"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f","Type":"ContainerDied","Data":"13da0910bcff0cef3358d8a28c1988d32e43c4c9bb53271d26bab119f3d2e2d9"}
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.421245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.476257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hg4p\" (UniqueName: \"kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p\") pod \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") "
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.476378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume\") pod \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") "
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.476432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume\") pod \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\" (UID: \"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f\") "
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.477772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume" (OuterVolumeSpecName: "config-volume") pod "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" (UID: "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.482701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" (UID: "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.482725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p" (OuterVolumeSpecName: "kube-api-access-7hg4p") pod "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" (UID: "ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f"). InnerVolumeSpecName "kube-api-access-7hg4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.577874 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hg4p\" (UniqueName: \"kubernetes.io/projected/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-kube-api-access-7hg4p\") on node \"crc\" DevicePath \"\""
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.577924 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-config-volume\") on node \"crc\" DevicePath \"\""
Nov 29 02:00:03 crc kubenswrapper[4749]: I1129 02:00:03.577937 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 29 02:00:04 crc kubenswrapper[4749]: I1129 02:00:04.159885 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl" event={"ID":"ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f","Type":"ContainerDied","Data":"7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a"}
Nov 29 02:00:04 crc kubenswrapper[4749]: I1129 02:00:04.159947 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7164dbbb7ffb5ee5bf827de8441e9ada3799caf16f5ca97074a13887b466055a"
Nov 29 02:00:04 crc kubenswrapper[4749]: I1129 02:00:04.159951 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"
Nov 29 02:00:04 crc kubenswrapper[4749]: I1129 02:00:04.532707 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"]
Nov 29 02:00:04 crc kubenswrapper[4749]: I1129 02:00:04.539990 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406315-dh88z"]
Nov 29 02:00:05 crc kubenswrapper[4749]: I1129 02:00:05.086361 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e79e70c-cf41-46f1-9df6-f13b5ff21f63" path="/var/lib/kubelet/pods/5e79e70c-cf41-46f1-9df6-f13b5ff21f63/volumes"
Nov 29 02:00:10 crc kubenswrapper[4749]: I1129 02:00:10.075963 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:00:10 crc kubenswrapper[4749]: E1129 02:00:10.076546 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:00:12 crc kubenswrapper[4749]: I1129 02:00:12.301686 4749 scope.go:117] "RemoveContainer" containerID="f1fbea0bdc5f5db766193201f4381eab3ddca62809b355c227cb9566d1a14e06"
Nov 29 02:00:21 crc kubenswrapper[4749]: I1129 02:00:21.076129 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:00:21 crc kubenswrapper[4749]: E1129 02:00:21.076907 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:00:34 crc kubenswrapper[4749]: I1129 02:00:34.075085 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:00:34 crc kubenswrapper[4749]: E1129 02:00:34.076103 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:00:48 crc kubenswrapper[4749]: I1129 02:00:48.075617 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:00:48 crc kubenswrapper[4749]: E1129 02:00:48.078600 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:01:00 crc kubenswrapper[4749]: I1129 02:01:00.074893 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:01:00 crc kubenswrapper[4749]: E1129 02:01:00.075911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:01:11 crc kubenswrapper[4749]: I1129 02:01:11.075709 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:01:11 crc kubenswrapper[4749]: E1129 02:01:11.076787 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:01:22 crc kubenswrapper[4749]: I1129 02:01:22.074938 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:01:22 crc kubenswrapper[4749]: E1129 02:01:22.075726 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:01:36 crc kubenswrapper[4749]: I1129 02:01:36.074988 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:01:36 crc kubenswrapper[4749]: E1129 02:01:36.075710 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:01:51 crc kubenswrapper[4749]: I1129 02:01:51.075672 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:01:51 crc kubenswrapper[4749]: E1129 02:01:51.076420 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:02:04 crc kubenswrapper[4749]: I1129 02:02:04.074961 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:02:04 crc kubenswrapper[4749]: E1129 02:02:04.075958 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:02:18 crc kubenswrapper[4749]: I1129 02:02:18.075673 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:02:18 crc kubenswrapper[4749]: E1129 02:02:18.076693 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:02:30 crc kubenswrapper[4749]: I1129 02:02:30.075449 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:02:30 crc kubenswrapper[4749]: E1129 02:02:30.076552 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:02:43 crc kubenswrapper[4749]: I1129 02:02:43.074860 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:02:43 crc kubenswrapper[4749]: E1129 02:02:43.075695 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:02:56 crc kubenswrapper[4749]: I1129 02:02:56.075243 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:02:56 crc kubenswrapper[4749]: E1129 02:02:56.076631 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:03:10 crc kubenswrapper[4749]: I1129 02:03:10.074957 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:03:10 crc kubenswrapper[4749]: E1129 02:03:10.076375 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:03:23 crc kubenswrapper[4749]: I1129 02:03:23.075097 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:03:23 crc kubenswrapper[4749]: E1129 02:03:23.075925 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:03:35 crc kubenswrapper[4749]: I1129 02:03:35.075118 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:03:35 crc kubenswrapper[4749]: E1129 02:03:35.076147 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:03:49 crc kubenswrapper[4749]: I1129 02:03:49.075276 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:03:49 crc kubenswrapper[4749]: E1129 02:03:49.076194 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:04:03 crc kubenswrapper[4749]: I1129 02:04:03.074952 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:04:03 crc kubenswrapper[4749]: E1129 02:04:03.075981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:04:17 crc kubenswrapper[4749]: I1129 02:04:17.079616 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:04:17 crc kubenswrapper[4749]: E1129 02:04:17.083693 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:04:29 crc kubenswrapper[4749]: I1129 02:04:29.074699 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:04:29 crc kubenswrapper[4749]: E1129 02:04:29.075545 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:04:41 crc kubenswrapper[4749]: I1129 02:04:41.075107 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:04:41 crc kubenswrapper[4749]: E1129 02:04:41.076759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:04:52 crc kubenswrapper[4749]: I1129 02:04:52.075606 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:04:52 crc kubenswrapper[4749]: E1129 02:04:52.076713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:05:03 crc kubenswrapper[4749]: I1129 02:05:03.075597 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562"
Nov 29 02:05:04 crc kubenswrapper[4749]: I1129 02:05:04.342601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09"}
Nov 29 02:07:01 crc kubenswrapper[4749]: I1129 02:07:01.909277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mxvch"]
Nov 29 02:07:01 crc kubenswrapper[4749]: E1129 02:07:01.911167 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" containerName="collect-profiles"
Nov 29 02:07:01 crc kubenswrapper[4749]: I1129 02:07:01.911299 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" containerName="collect-profiles"
Nov 29 02:07:01 crc kubenswrapper[4749]: I1129 02:07:01.911587 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" containerName="collect-profiles"
Nov 29 02:07:01 crc kubenswrapper[4749]: I1129 02:07:01.912916 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:01 crc kubenswrapper[4749]: I1129 02:07:01.921998 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxvch"]
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.090805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.090862 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.091232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgfgk\" (UniqueName: \"kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.192969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgfgk\" (UniqueName: \"kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.193049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.193092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.193663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.193792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.213759 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgfgk\" (UniqueName: \"kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk\") pod \"certified-operators-mxvch\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.248054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxvch"
Nov 29 02:07:02 crc kubenswrapper[4749]: I1129 02:07:02.684945 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mxvch"]
Nov 29 02:07:03 crc kubenswrapper[4749]: I1129 02:07:03.410460 4749 generic.go:334] "Generic (PLEG): container finished" podID="17313711-29a1-4914-9ffa-166e14a831fe" containerID="f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658" exitCode=0
Nov 29 02:07:03 crc kubenswrapper[4749]: I1129 02:07:03.410567 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerDied","Data":"f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658"}
Nov 29 02:07:03 crc kubenswrapper[4749]: I1129 02:07:03.410937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerStarted","Data":"88df57fda5342d5e346202b925ecd469131a8d14a4df0d1ffd60b72cfa38cde7"}
Nov 29 02:07:03 crc kubenswrapper[4749]: I1129 02:07:03.414528 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.908910 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wns88"]
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.914963 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wns88"
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.927568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wns88"]
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.937131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdqh\" (UniqueName: \"kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88"
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.937420 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88"
Nov 29 02:07:04 crc kubenswrapper[4749]: I1129 02:07:04.937574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88"
Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.039453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88"
Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.039575 4749
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdqh\" (UniqueName: \"kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.039713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.040346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.040439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.073753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntdqh\" (UniqueName: \"kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh\") pod \"redhat-operators-wns88\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.260919 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.430157 4749 generic.go:334] "Generic (PLEG): container finished" podID="17313711-29a1-4914-9ffa-166e14a831fe" containerID="15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf" exitCode=0 Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.430397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerDied","Data":"15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf"} Nov 29 02:07:05 crc kubenswrapper[4749]: I1129 02:07:05.742907 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wns88"] Nov 29 02:07:05 crc kubenswrapper[4749]: W1129 02:07:05.753021 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3d23790_a765_4719_ad05_96977d8ded07.slice/crio-99ca12c7c15e7156455fddc43196b51136d9ca6ba919730cff80fedaf0c3e377 WatchSource:0}: Error finding container 99ca12c7c15e7156455fddc43196b51136d9ca6ba919730cff80fedaf0c3e377: Status 404 returned error can't find the container with id 99ca12c7c15e7156455fddc43196b51136d9ca6ba919730cff80fedaf0c3e377 Nov 29 02:07:06 crc kubenswrapper[4749]: I1129 02:07:06.441996 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3d23790-a765-4719-ad05-96977d8ded07" containerID="87e683cfde956da9a7243499e146988aa7687841f7c820323c6f91557d959b1d" exitCode=0 Nov 29 02:07:06 crc kubenswrapper[4749]: I1129 02:07:06.442511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerDied","Data":"87e683cfde956da9a7243499e146988aa7687841f7c820323c6f91557d959b1d"} Nov 29 02:07:06 crc kubenswrapper[4749]: I1129 02:07:06.442550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerStarted","Data":"99ca12c7c15e7156455fddc43196b51136d9ca6ba919730cff80fedaf0c3e377"} Nov 29 02:07:06 crc kubenswrapper[4749]: I1129 02:07:06.449081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerStarted","Data":"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e"} Nov 29 02:07:06 crc kubenswrapper[4749]: I1129 02:07:06.489870 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mxvch" podStartSLOduration=2.99915202 podStartE2EDuration="5.489844117s" podCreationTimestamp="2025-11-29 02:07:01 +0000 UTC" firstStartedPulling="2025-11-29 02:07:03.413714445 +0000 UTC m=+3366.585864342" lastFinishedPulling="2025-11-29 02:07:05.904406582 +0000 UTC m=+3369.076556439" observedRunningTime="2025-11-29 02:07:06.486981238 +0000 UTC m=+3369.659131155" watchObservedRunningTime="2025-11-29 02:07:06.489844117 +0000 UTC m=+3369.661994014" Nov 29 02:07:07 crc kubenswrapper[4749]: I1129 02:07:07.463548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerStarted","Data":"3b8f1eae1b16fddc566c593384bbf1d5e6bc6ecd8195eb4523128aa79533b02e"} Nov 29 02:07:08 
crc kubenswrapper[4749]: I1129 02:07:08.475011 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3d23790-a765-4719-ad05-96977d8ded07" containerID="3b8f1eae1b16fddc566c593384bbf1d5e6bc6ecd8195eb4523128aa79533b02e" exitCode=0 Nov 29 02:07:08 crc kubenswrapper[4749]: I1129 02:07:08.475114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerDied","Data":"3b8f1eae1b16fddc566c593384bbf1d5e6bc6ecd8195eb4523128aa79533b02e"} Nov 29 02:07:09 crc kubenswrapper[4749]: I1129 02:07:09.485685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerStarted","Data":"fc2c4546e0203e46e27d783e289b0a32aedf84e36f68cb6731121d9dda22886b"} Nov 29 02:07:09 crc kubenswrapper[4749]: I1129 02:07:09.515662 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wns88" podStartSLOduration=2.939415518 podStartE2EDuration="5.515634679s" podCreationTimestamp="2025-11-29 02:07:04 +0000 UTC" firstStartedPulling="2025-11-29 02:07:06.446050785 +0000 UTC m=+3369.618200682" lastFinishedPulling="2025-11-29 02:07:09.022269946 +0000 UTC m=+3372.194419843" observedRunningTime="2025-11-29 02:07:09.509587604 +0000 UTC m=+3372.681737501" watchObservedRunningTime="2025-11-29 02:07:09.515634679 +0000 UTC m=+3372.687784546" Nov 29 02:07:12 crc kubenswrapper[4749]: I1129 02:07:12.249255 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:12 crc kubenswrapper[4749]: I1129 02:07:12.250932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:12 crc kubenswrapper[4749]: I1129 02:07:12.331697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:12 crc kubenswrapper[4749]: I1129 02:07:12.572189 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:13 crc kubenswrapper[4749]: I1129 02:07:13.290094 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxvch"] Nov 29 02:07:14 crc kubenswrapper[4749]: I1129 02:07:14.530055 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mxvch" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="registry-server" containerID="cri-o://f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e" gracePeriod=2 Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.018928 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.094304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgfgk\" (UniqueName: \"kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk\") pod \"17313711-29a1-4914-9ffa-166e14a831fe\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.094438 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities\") pod \"17313711-29a1-4914-9ffa-166e14a831fe\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.094497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content\") pod \"17313711-29a1-4914-9ffa-166e14a831fe\" (UID: \"17313711-29a1-4914-9ffa-166e14a831fe\") " Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.095524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities" (OuterVolumeSpecName: "utilities") pod "17313711-29a1-4914-9ffa-166e14a831fe" (UID: "17313711-29a1-4914-9ffa-166e14a831fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.104745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk" (OuterVolumeSpecName: "kube-api-access-cgfgk") pod "17313711-29a1-4914-9ffa-166e14a831fe" (UID: "17313711-29a1-4914-9ffa-166e14a831fe"). InnerVolumeSpecName "kube-api-access-cgfgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.195523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgfgk\" (UniqueName: \"kubernetes.io/projected/17313711-29a1-4914-9ffa-166e14a831fe-kube-api-access-cgfgk\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.195819 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.261758 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.262337 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.396960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17313711-29a1-4914-9ffa-166e14a831fe" (UID: "17313711-29a1-4914-9ffa-166e14a831fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.398841 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17313711-29a1-4914-9ffa-166e14a831fe-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.542342 4749 generic.go:334] "Generic (PLEG): container finished" podID="17313711-29a1-4914-9ffa-166e14a831fe" containerID="f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e" exitCode=0 Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.542438 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mxvch" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.542458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerDied","Data":"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e"} Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.542559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mxvch" event={"ID":"17313711-29a1-4914-9ffa-166e14a831fe","Type":"ContainerDied","Data":"88df57fda5342d5e346202b925ecd469131a8d14a4df0d1ffd60b72cfa38cde7"} Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.542598 4749 scope.go:117] "RemoveContainer" containerID="f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.566994 4749 scope.go:117] "RemoveContainer" containerID="15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.597157 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mxvch"] Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.601298 4749 scope.go:117] "RemoveContainer" containerID="f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.605579 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mxvch"] Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.626051 4749 scope.go:117] "RemoveContainer" containerID="f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e" Nov 29 02:07:15 crc kubenswrapper[4749]: E1129 02:07:15.626656 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e\": container with ID starting with f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e not found: ID does not exist" containerID="f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.626688 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e"} err="failed to get container status \"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e\": rpc error: code = NotFound desc = could not find container \"f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e\": container with ID starting with f38eda84bedafaaccac548da21fce5fce3595fa33f48a2702d436a6a5591450e not found: ID does not exist" Nov 29 
02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.626710 4749 scope.go:117] "RemoveContainer" containerID="15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf" Nov 29 02:07:15 crc kubenswrapper[4749]: E1129 02:07:15.627018 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf\": container with ID starting with 15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf not found: ID does not exist" containerID="15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.627043 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf"} err="failed to get container status \"15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf\": rpc error: code = NotFound desc = could not find container \"15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf\": container with ID starting with 15e9db3789e61e818c7e9ac541ad8c0fbdbb72d4ae239fb0bb0d1d800e90efaf not found: ID does not exist" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.627055 4749 scope.go:117] "RemoveContainer" containerID="f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658" Nov 29 02:07:15 crc kubenswrapper[4749]: E1129 02:07:15.628002 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658\": container with ID starting with f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658 not found: ID does not exist" containerID="f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658" Nov 29 02:07:15 crc kubenswrapper[4749]: I1129 02:07:15.628083 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658"} err="failed to get container status \"f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658\": rpc error: code = NotFound desc = could not find container \"f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658\": container with ID starting with f6a49f294f298e172dae26ab18d3ac66f43560004d348f9cc84d8612d1f36658 not found: ID does not exist" Nov 29 02:07:16 crc kubenswrapper[4749]: I1129 02:07:16.342834 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wns88" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="registry-server" probeResult="failure" output=< Nov 29 02:07:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 02:07:16 crc kubenswrapper[4749]: > Nov 29 02:07:17 crc kubenswrapper[4749]: I1129 02:07:17.100153 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17313711-29a1-4914-9ffa-166e14a831fe" path="/var/lib/kubelet/pods/17313711-29a1-4914-9ffa-166e14a831fe/volumes" Nov 29 02:07:25 crc kubenswrapper[4749]: I1129 02:07:25.339609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:25 crc kubenswrapper[4749]: I1129 02:07:25.375564 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:07:25 crc kubenswrapper[4749]: I1129 02:07:25.376807 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:07:25 crc kubenswrapper[4749]: I1129 02:07:25.423834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:25 crc kubenswrapper[4749]: I1129 02:07:25.591168 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wns88"] Nov 29 02:07:26 crc kubenswrapper[4749]: I1129 02:07:26.645595 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wns88" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="registry-server" containerID="cri-o://fc2c4546e0203e46e27d783e289b0a32aedf84e36f68cb6731121d9dda22886b" gracePeriod=2 Nov 29 02:07:27 crc kubenswrapper[4749]: I1129 02:07:27.655767 4749 generic.go:334] "Generic (PLEG): container finished" podID="d3d23790-a765-4719-ad05-96977d8ded07" containerID="fc2c4546e0203e46e27d783e289b0a32aedf84e36f68cb6731121d9dda22886b" exitCode=0 Nov 29 02:07:27 crc kubenswrapper[4749]: I1129 02:07:27.655836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerDied","Data":"fc2c4546e0203e46e27d783e289b0a32aedf84e36f68cb6731121d9dda22886b"} Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.301104 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.411008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content\") pod \"d3d23790-a765-4719-ad05-96977d8ded07\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.411078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities\") pod \"d3d23790-a765-4719-ad05-96977d8ded07\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.411128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntdqh\" (UniqueName: \"kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh\") pod \"d3d23790-a765-4719-ad05-96977d8ded07\" (UID: \"d3d23790-a765-4719-ad05-96977d8ded07\") " Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.412656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities" (OuterVolumeSpecName: "utilities") pod "d3d23790-a765-4719-ad05-96977d8ded07" (UID: "d3d23790-a765-4719-ad05-96977d8ded07"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.421801 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh" (OuterVolumeSpecName: "kube-api-access-ntdqh") pod "d3d23790-a765-4719-ad05-96977d8ded07" (UID: "d3d23790-a765-4719-ad05-96977d8ded07"). InnerVolumeSpecName "kube-api-access-ntdqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.512940 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.512982 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntdqh\" (UniqueName: \"kubernetes.io/projected/d3d23790-a765-4719-ad05-96977d8ded07-kube-api-access-ntdqh\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.532744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3d23790-a765-4719-ad05-96977d8ded07" (UID: "d3d23790-a765-4719-ad05-96977d8ded07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.614397 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3d23790-a765-4719-ad05-96977d8ded07-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.673924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wns88" event={"ID":"d3d23790-a765-4719-ad05-96977d8ded07","Type":"ContainerDied","Data":"99ca12c7c15e7156455fddc43196b51136d9ca6ba919730cff80fedaf0c3e377"} Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.674005 4749 scope.go:117] "RemoveContainer" containerID="fc2c4546e0203e46e27d783e289b0a32aedf84e36f68cb6731121d9dda22886b" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.674137 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wns88" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.718995 4749 scope.go:117] "RemoveContainer" containerID="3b8f1eae1b16fddc566c593384bbf1d5e6bc6ecd8195eb4523128aa79533b02e" Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.741971 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wns88"] Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.757322 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wns88"] Nov 29 02:07:28 crc kubenswrapper[4749]: I1129 02:07:28.760911 4749 scope.go:117] "RemoveContainer" containerID="87e683cfde956da9a7243499e146988aa7687841f7c820323c6f91557d959b1d" Nov 29 02:07:29 crc kubenswrapper[4749]: I1129 02:07:29.090493 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d23790-a765-4719-ad05-96977d8ded07" path="/var/lib/kubelet/pods/d3d23790-a765-4719-ad05-96977d8ded07/volumes" Nov 29 02:07:55 crc kubenswrapper[4749]: I1129 02:07:55.374334 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:07:55 crc kubenswrapper[4749]: I1129 02:07:55.375083 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:08:25 crc kubenswrapper[4749]: I1129 02:08:25.374459 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:08:25 crc kubenswrapper[4749]: I1129 02:08:25.375266 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:08:25 crc kubenswrapper[4749]: I1129 02:08:25.375334 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:08:25 crc kubenswrapper[4749]: I1129 02:08:25.376285 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:08:25 crc kubenswrapper[4749]: I1129 02:08:25.376382 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09" gracePeriod=600 
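
That 02:08:25 sequence is the complete liveness-failure path for machine-config-daemon: patch_prober logs the connection-refused GET against http://127.0.0.1:8798/health, prober.go reports the failure, the sync loop marks the probe unhealthy, kuberuntime_manager records that the container will be restarted, and the kill is issued with the pod's 600-second grace period. A sketch of the check loop the log implies, with the 10s period and failure threshold of 3 as assumed kubelet defaults (the pod's actual probe spec is not visible in this log):

```go
// Hypothetical reconstruction of the HTTP liveness check that produces the
// "connect: connection refused" probe failures above. Period and threshold
// are assumed kubelet defaults; only the URL comes from the log.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	const url = "http://127.0.0.1:8798/health" // endpoint from the log
	failures := 0
	for {
		if err := probeOnce(url); err != nil {
			failures++
			fmt.Printf("probe failed (%d/3): %v\n", failures, err)
			if failures >= 3 { // assumed failureThreshold
				fmt.Println("liveness exceeded: container would be killed and restarted")
				return
			}
		} else {
			failures = 0 // any success resets the counter
		}
		time.Sleep(10 * time.Second) // assumed periodSeconds
	}
}
```
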
Nov 29 02:08:26 crc kubenswrapper[4749]: I1129 02:08:26.259308 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09" exitCode=0 Nov 29 02:08:26 crc kubenswrapper[4749]: I1129 02:08:26.259842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09"} Nov 29 02:08:26 crc kubenswrapper[4749]: I1129 02:08:26.259891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e"} Nov 29 02:08:26 crc kubenswrapper[4749]: I1129 02:08:26.259928 4749 scope.go:117] "RemoveContainer" containerID="aea41a32541d7b1e04fc4391f55ec5c732f62e2b195ec4f38e7d490e3c0cb562" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.801330 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802607 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="extract-utilities" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802624 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="extract-utilities" Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802640 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="extract-content" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802647 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="extract-content" Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802657 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802664 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802685 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802693 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="extract-content" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802722 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="extract-content" Nov 29 02:08:40 crc kubenswrapper[4749]: E1129 02:08:40.802737 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="extract-utilities" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802745 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="extract-utilities" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802941 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="17313711-29a1-4914-9ffa-166e14a831fe" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.802967 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d23790-a765-4719-ad05-96977d8ded07" containerName="registry-server" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.805230 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.812055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.964387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.964454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:40 crc kubenswrapper[4749]: I1129 02:08:40.964570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzb4\" (UniqueName: \"kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.066051 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.066106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.066174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzb4\" (UniqueName: \"kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.066643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.066957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.098411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzb4\" (UniqueName: \"kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4\") pod \"community-operators-fd7ln\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.134209 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:41 crc kubenswrapper[4749]: I1129 02:08:41.405432 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:42 crc kubenswrapper[4749]: I1129 02:08:42.426826 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerID="29efe8de0077adb8e6858b222564fdf16498333ee018ae3a94fde823f5aa4f3e" exitCode=0 Nov 29 02:08:42 crc kubenswrapper[4749]: I1129 02:08:42.426896 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerDied","Data":"29efe8de0077adb8e6858b222564fdf16498333ee018ae3a94fde823f5aa4f3e"} Nov 29 02:08:42 crc kubenswrapper[4749]: I1129 02:08:42.426940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerStarted","Data":"5cdecb975c8484a48da06a460bae539a7a1e0c5f5f1d4553a4ad9fdf517020b8"} Nov 29 02:08:44 crc kubenswrapper[4749]: I1129 02:08:44.452343 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerID="226f7d44cabdda398a5df9f4cc9233bb60e439c89491d6e42be5f21938b3d4b8" exitCode=0 Nov 29 02:08:44 crc kubenswrapper[4749]: I1129 02:08:44.452446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerDied","Data":"226f7d44cabdda398a5df9f4cc9233bb60e439c89491d6e42be5f21938b3d4b8"} Nov 29 02:08:45 crc kubenswrapper[4749]: I1129 02:08:45.470142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerStarted","Data":"8aac2a3482e61fd9b5af919c0f0f97d6d785618354fa037988fed03102c87b87"} Nov 29 02:08:45 crc kubenswrapper[4749]: I1129 02:08:45.504871 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fd7ln" podStartSLOduration=3.002132223 podStartE2EDuration="5.504846039s" podCreationTimestamp="2025-11-29 02:08:40 +0000 UTC" firstStartedPulling="2025-11-29 02:08:42.429865604 +0000 UTC 
Nov 29 02:08:51 crc kubenswrapper[4749]: I1129 02:08:51.134670 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:51 crc kubenswrapper[4749]: I1129 02:08:51.135172 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:51 crc kubenswrapper[4749]: I1129 02:08:51.200423 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:51 crc kubenswrapper[4749]: I1129 02:08:51.647019 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:51 crc kubenswrapper[4749]: I1129 02:08:51.690910 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:53 crc kubenswrapper[4749]: I1129 02:08:53.548874 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fd7ln" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="registry-server" containerID="cri-o://8aac2a3482e61fd9b5af919c0f0f97d6d785618354fa037988fed03102c87b87" gracePeriod=2 Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.561383 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerID="8aac2a3482e61fd9b5af919c0f0f97d6d785618354fa037988fed03102c87b87" exitCode=0 Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.561438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerDied","Data":"8aac2a3482e61fd9b5af919c0f0f97d6d785618354fa037988fed03102c87b87"} Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.770478 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.946499 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content\") pod \"f5c25468-58fa-4c3d-aa96-63308edecd2b\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.946636 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities\") pod \"f5c25468-58fa-4c3d-aa96-63308edecd2b\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.946677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbzb4\" (UniqueName: \"kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4\") pod \"f5c25468-58fa-4c3d-aa96-63308edecd2b\" (UID: \"f5c25468-58fa-4c3d-aa96-63308edecd2b\") " Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.947725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities" (OuterVolumeSpecName: "utilities") pod "f5c25468-58fa-4c3d-aa96-63308edecd2b" (UID: "f5c25468-58fa-4c3d-aa96-63308edecd2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.953720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4" (OuterVolumeSpecName: "kube-api-access-pbzb4") pod "f5c25468-58fa-4c3d-aa96-63308edecd2b" (UID: "f5c25468-58fa-4c3d-aa96-63308edecd2b"). InnerVolumeSpecName "kube-api-access-pbzb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:08:54 crc kubenswrapper[4749]: I1129 02:08:54.995786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5c25468-58fa-4c3d-aa96-63308edecd2b" (UID: "f5c25468-58fa-4c3d-aa96-63308edecd2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.048397 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.048432 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c25468-58fa-4c3d-aa96-63308edecd2b-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.048442 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbzb4\" (UniqueName: \"kubernetes.io/projected/f5c25468-58fa-4c3d-aa96-63308edecd2b-kube-api-access-pbzb4\") on node \"crc\" DevicePath \"\"" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.574966 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fd7ln" event={"ID":"f5c25468-58fa-4c3d-aa96-63308edecd2b","Type":"ContainerDied","Data":"5cdecb975c8484a48da06a460bae539a7a1e0c5f5f1d4553a4ad9fdf517020b8"} Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.575080 4749 scope.go:117] "RemoveContainer" containerID="8aac2a3482e61fd9b5af919c0f0f97d6d785618354fa037988fed03102c87b87" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.575095 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fd7ln" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.617138 4749 scope.go:117] "RemoveContainer" containerID="226f7d44cabdda398a5df9f4cc9233bb60e439c89491d6e42be5f21938b3d4b8" Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.618504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.633052 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fd7ln"] Nov 29 02:08:55 crc kubenswrapper[4749]: I1129 02:08:55.665760 4749 scope.go:117] "RemoveContainer" containerID="29efe8de0077adb8e6858b222564fdf16498333ee018ae3a94fde823f5aa4f3e" Nov 29 02:08:57 crc kubenswrapper[4749]: I1129 02:08:57.091659 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" path="/var/lib/kubelet/pods/f5c25468-58fa-4c3d-aa96-63308edecd2b/volumes" Nov 29 02:10:25 crc kubenswrapper[4749]: I1129 02:10:25.374303 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:10:25 crc kubenswrapper[4749]: I1129 02:10:25.374908 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:10:55 crc kubenswrapper[4749]: I1129 02:10:55.374541 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:10:55 crc kubenswrapper[4749]: I1129 02:10:55.376327 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.129674 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:01 crc kubenswrapper[4749]: E1129 02:11:01.130176 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="registry-server" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.130187 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="registry-server" Nov 29 02:11:01 crc kubenswrapper[4749]: E1129 02:11:01.130226 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="extract-utilities" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.130233 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="extract-utilities" Nov 29 02:11:01 crc kubenswrapper[4749]: E1129 02:11:01.130248 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="extract-content" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.130255 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="extract-content" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.130379 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c25468-58fa-4c3d-aa96-63308edecd2b" containerName="registry-server" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.131264 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.133616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.133645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.133991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd5lb\" (UniqueName: \"kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.149393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.234911 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.234975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.235266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd5lb\" (UniqueName: \"kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.235524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.235560 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.257769 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vd5lb\" (UniqueName: \"kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb\") pod \"redhat-marketplace-ztl52\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.451786 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.709450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:01 crc kubenswrapper[4749]: I1129 02:11:01.768943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerStarted","Data":"c8c0256b945a3b50b2c72e76df6d823ac2110d47e30fdac02e21869fea09a30d"} Nov 29 02:11:02 crc kubenswrapper[4749]: I1129 02:11:02.782076 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerID="5219ec56b253b21bd2d3e889d5aa9a57db47f3e7775bf1524b23548d2b1285c2" exitCode=0 Nov 29 02:11:02 crc kubenswrapper[4749]: I1129 02:11:02.782147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerDied","Data":"5219ec56b253b21bd2d3e889d5aa9a57db47f3e7775bf1524b23548d2b1285c2"} Nov 29 02:11:03 crc kubenswrapper[4749]: I1129 02:11:03.791763 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerID="b303afe00fb2c1fae69092bf334baa2d148390397ac01ed4ceadc9d421f7e498" exitCode=0 Nov 29 02:11:03 crc kubenswrapper[4749]: I1129 02:11:03.791871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerDied","Data":"b303afe00fb2c1fae69092bf334baa2d148390397ac01ed4ceadc9d421f7e498"} Nov 29 02:11:04 crc kubenswrapper[4749]: I1129 02:11:04.806256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerStarted","Data":"9d72709185eefeb3729834b991042ee65389e90b2d9b2b6ca6eb4fc38a62767f"} Nov 29 02:11:04 crc kubenswrapper[4749]: I1129 02:11:04.826190 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztl52" podStartSLOduration=2.20220817 podStartE2EDuration="3.826169127s" podCreationTimestamp="2025-11-29 02:11:01 +0000 UTC" firstStartedPulling="2025-11-29 02:11:02.785709752 +0000 UTC m=+3605.957859639" lastFinishedPulling="2025-11-29 02:11:04.409670729 +0000 UTC m=+3607.581820596" observedRunningTime="2025-11-29 02:11:04.821659009 +0000 UTC m=+3607.993808946" watchObservedRunningTime="2025-11-29 02:11:04.826169127 +0000 UTC m=+3607.998318994" Nov 29 02:11:11 crc kubenswrapper[4749]: I1129 02:11:11.452602 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:11 crc kubenswrapper[4749]: I1129 02:11:11.453217 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:11 crc kubenswrapper[4749]: I1129 02:11:11.526411 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:11 crc kubenswrapper[4749]: I1129 02:11:11.933857 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:12 crc kubenswrapper[4749]: I1129 02:11:12.007877 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:13 crc kubenswrapper[4749]: I1129 02:11:13.888590 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztl52" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="registry-server" containerID="cri-o://9d72709185eefeb3729834b991042ee65389e90b2d9b2b6ca6eb4fc38a62767f" gracePeriod=2 Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.899566 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerID="9d72709185eefeb3729834b991042ee65389e90b2d9b2b6ca6eb4fc38a62767f" exitCode=0 Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.899657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerDied","Data":"9d72709185eefeb3729834b991042ee65389e90b2d9b2b6ca6eb4fc38a62767f"} Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.899835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztl52" event={"ID":"fbf254b4-6f58-42f9-9740-fd6787f1a59f","Type":"ContainerDied","Data":"c8c0256b945a3b50b2c72e76df6d823ac2110d47e30fdac02e21869fea09a30d"} Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.899855 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c0256b945a3b50b2c72e76df6d823ac2110d47e30fdac02e21869fea09a30d" Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.915809 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.953527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd5lb\" (UniqueName: \"kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb\") pod \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.953595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content\") pod \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.953680 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities\") pod \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\" (UID: \"fbf254b4-6f58-42f9-9740-fd6787f1a59f\") " Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.963016 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities" (OuterVolumeSpecName: "utilities") pod "fbf254b4-6f58-42f9-9740-fd6787f1a59f" (UID: "fbf254b4-6f58-42f9-9740-fd6787f1a59f"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.963789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb" (OuterVolumeSpecName: "kube-api-access-vd5lb") pod "fbf254b4-6f58-42f9-9740-fd6787f1a59f" (UID: "fbf254b4-6f58-42f9-9740-fd6787f1a59f"). InnerVolumeSpecName "kube-api-access-vd5lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:11:14 crc kubenswrapper[4749]: I1129 02:11:14.981421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbf254b4-6f58-42f9-9740-fd6787f1a59f" (UID: "fbf254b4-6f58-42f9-9740-fd6787f1a59f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.055485 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd5lb\" (UniqueName: \"kubernetes.io/projected/fbf254b4-6f58-42f9-9740-fd6787f1a59f-kube-api-access-vd5lb\") on node \"crc\" DevicePath \"\"" Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.055786 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.055807 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbf254b4-6f58-42f9-9740-fd6787f1a59f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.910585 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztl52" Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.946849 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:15 crc kubenswrapper[4749]: I1129 02:11:15.955579 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztl52"] Nov 29 02:11:17 crc kubenswrapper[4749]: I1129 02:11:17.093022 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" path="/var/lib/kubelet/pods/fbf254b4-6f58-42f9-9740-fd6787f1a59f/volumes" Nov 29 02:11:25 crc kubenswrapper[4749]: I1129 02:11:25.374685 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:11:25 crc kubenswrapper[4749]: I1129 02:11:25.375345 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:11:25 crc kubenswrapper[4749]: I1129 02:11:25.375409 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:11:25 crc kubenswrapper[4749]: I1129 02:11:25.376295 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:11:25 crc kubenswrapper[4749]: I1129 02:11:25.376389 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" gracePeriod=600 Nov 29 02:11:25 crc kubenswrapper[4749]: E1129 02:11:25.525126 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:11:26 crc kubenswrapper[4749]: I1129 02:11:26.003665 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" exitCode=0 Nov 29 02:11:26 crc kubenswrapper[4749]: I1129 02:11:26.003729 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e"} Nov 29 02:11:26 
crc kubenswrapper[4749]: I1129 02:11:26.004022 4749 scope.go:117] "RemoveContainer" containerID="a713810679a5b214b2d8770ab6cee7b2e37c6a58a4099d80379533c36927bf09" Nov 29 02:11:26 crc kubenswrapper[4749]: I1129 02:11:26.004605 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:11:26 crc kubenswrapper[4749]: E1129 02:11:26.004912 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:11:37 crc kubenswrapper[4749]: I1129 02:11:37.084534 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:11:37 crc kubenswrapper[4749]: E1129 02:11:37.085770 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:11:51 crc kubenswrapper[4749]: I1129 02:11:51.075149 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:11:51 crc kubenswrapper[4749]: E1129 02:11:51.076290 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:12:02 crc kubenswrapper[4749]: I1129 02:12:02.074680 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:12:02 crc kubenswrapper[4749]: E1129 02:12:02.075589 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:12:15 crc kubenswrapper[4749]: I1129 02:12:15.075668 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:12:15 crc kubenswrapper[4749]: E1129 02:12:15.076413 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:12:27 crc 
kubenswrapper[4749]: I1129 02:12:27.083934 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:12:27 crc kubenswrapper[4749]: E1129 02:12:27.085063 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:12:40 crc kubenswrapper[4749]: I1129 02:12:40.075561 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:12:40 crc kubenswrapper[4749]: E1129 02:12:40.077380 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:12:51 crc kubenswrapper[4749]: I1129 02:12:51.076967 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:12:51 crc kubenswrapper[4749]: E1129 02:12:51.078456 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:13:02 crc kubenswrapper[4749]: I1129 02:13:02.074827 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:13:02 crc kubenswrapper[4749]: E1129 02:13:02.075843 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:13:16 crc kubenswrapper[4749]: I1129 02:13:16.077852 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:13:16 crc kubenswrapper[4749]: E1129 02:13:16.079007 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:13:31 crc kubenswrapper[4749]: I1129 02:13:31.076254 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:13:31 crc 
kubenswrapper[4749]: E1129 02:13:31.077422 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:13:45 crc kubenswrapper[4749]: I1129 02:13:45.076145 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:13:45 crc kubenswrapper[4749]: E1129 02:13:45.077347 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:13:58 crc kubenswrapper[4749]: I1129 02:13:58.075603 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:13:58 crc kubenswrapper[4749]: E1129 02:13:58.076740 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:14:09 crc kubenswrapper[4749]: I1129 02:14:09.075356 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:14:09 crc kubenswrapper[4749]: E1129 02:14:09.076439 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:14:23 crc kubenswrapper[4749]: I1129 02:14:23.074449 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:14:23 crc kubenswrapper[4749]: E1129 02:14:23.075039 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:14:34 crc kubenswrapper[4749]: I1129 02:14:34.075475 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:14:34 crc kubenswrapper[4749]: E1129 02:14:34.076142 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:14:49 crc kubenswrapper[4749]: I1129 02:14:49.076480 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:14:49 crc kubenswrapper[4749]: E1129 02:14:49.077797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.074607 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:15:00 crc kubenswrapper[4749]: E1129 02:15:00.075547 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.207588 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc"] Nov 29 02:15:00 crc kubenswrapper[4749]: E1129 02:15:00.208236 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="registry-server" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.208276 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="registry-server" Nov 29 02:15:00 crc kubenswrapper[4749]: E1129 02:15:00.208304 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="extract-utilities" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.208322 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="extract-utilities" Nov 29 02:15:00 crc kubenswrapper[4749]: E1129 02:15:00.208352 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="extract-content" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.208372 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="extract-content" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.208751 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbf254b4-6f58-42f9-9740-fd6787f1a59f" containerName="registry-server" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.210051 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.212798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.217171 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.219792 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc"] Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.256249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.256321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.256402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfh2\" (UniqueName: \"kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.357952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.358012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.358075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfh2\" (UniqueName: \"kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.359578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume\") pod 
\"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.364744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.382532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfh2\" (UniqueName: \"kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2\") pod \"collect-profiles-29406375-8f5rc\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:00 crc kubenswrapper[4749]: I1129 02:15:00.531232 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:01 crc kubenswrapper[4749]: I1129 02:15:01.021519 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc"] Nov 29 02:15:01 crc kubenswrapper[4749]: I1129 02:15:01.213534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" event={"ID":"2b7973bd-913f-4441-8e08-9d5ab221b673","Type":"ContainerStarted","Data":"2f633c8cfac15f8676af00b34b24107a3153defc94ea298f9aeb3d288bb0a27c"} Nov 29 02:15:01 crc kubenswrapper[4749]: I1129 02:15:01.213585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" event={"ID":"2b7973bd-913f-4441-8e08-9d5ab221b673","Type":"ContainerStarted","Data":"385783de668a68bfad70370f7667ffccd0ae55d74d166fb139240b6703c2de08"} Nov 29 02:15:01 crc kubenswrapper[4749]: I1129 02:15:01.233268 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" podStartSLOduration=1.233252925 podStartE2EDuration="1.233252925s" podCreationTimestamp="2025-11-29 02:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:15:01.23264562 +0000 UTC m=+3844.404795477" watchObservedRunningTime="2025-11-29 02:15:01.233252925 +0000 UTC m=+3844.405402782" Nov 29 02:15:02 crc kubenswrapper[4749]: I1129 02:15:02.223661 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b7973bd-913f-4441-8e08-9d5ab221b673" containerID="2f633c8cfac15f8676af00b34b24107a3153defc94ea298f9aeb3d288bb0a27c" exitCode=0 Nov 29 02:15:02 crc kubenswrapper[4749]: I1129 02:15:02.224018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" event={"ID":"2b7973bd-913f-4441-8e08-9d5ab221b673","Type":"ContainerDied","Data":"2f633c8cfac15f8676af00b34b24107a3153defc94ea298f9aeb3d288bb0a27c"} Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.663984 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.832828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume\") pod \"2b7973bd-913f-4441-8e08-9d5ab221b673\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.832893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume\") pod \"2b7973bd-913f-4441-8e08-9d5ab221b673\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.832922 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcfh2\" (UniqueName: \"kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2\") pod \"2b7973bd-913f-4441-8e08-9d5ab221b673\" (UID: \"2b7973bd-913f-4441-8e08-9d5ab221b673\") " Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.833896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b7973bd-913f-4441-8e08-9d5ab221b673" (UID: "2b7973bd-913f-4441-8e08-9d5ab221b673"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.838381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2" (OuterVolumeSpecName: "kube-api-access-qcfh2") pod "2b7973bd-913f-4441-8e08-9d5ab221b673" (UID: "2b7973bd-913f-4441-8e08-9d5ab221b673"). InnerVolumeSpecName "kube-api-access-qcfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.845349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b7973bd-913f-4441-8e08-9d5ab221b673" (UID: "2b7973bd-913f-4441-8e08-9d5ab221b673"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.934883 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b7973bd-913f-4441-8e08-9d5ab221b673-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.934940 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b7973bd-913f-4441-8e08-9d5ab221b673-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:15:03 crc kubenswrapper[4749]: I1129 02:15:03.935464 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcfh2\" (UniqueName: \"kubernetes.io/projected/2b7973bd-913f-4441-8e08-9d5ab221b673-kube-api-access-qcfh2\") on node \"crc\" DevicePath \"\"" Nov 29 02:15:04 crc kubenswrapper[4749]: I1129 02:15:04.241436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" event={"ID":"2b7973bd-913f-4441-8e08-9d5ab221b673","Type":"ContainerDied","Data":"385783de668a68bfad70370f7667ffccd0ae55d74d166fb139240b6703c2de08"} Nov 29 02:15:04 crc kubenswrapper[4749]: I1129 02:15:04.241474 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385783de668a68bfad70370f7667ffccd0ae55d74d166fb139240b6703c2de08" Nov 29 02:15:04 crc kubenswrapper[4749]: I1129 02:15:04.241499 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc" Nov 29 02:15:04 crc kubenswrapper[4749]: I1129 02:15:04.326868 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh"] Nov 29 02:15:04 crc kubenswrapper[4749]: I1129 02:15:04.332183 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406330-bfhgh"] Nov 29 02:15:05 crc kubenswrapper[4749]: I1129 02:15:05.093342 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3ad010e-7987-4781-8c42-3dbbb4006be8" path="/var/lib/kubelet/pods/a3ad010e-7987-4781-8c42-3dbbb4006be8/volumes" Nov 29 02:15:12 crc kubenswrapper[4749]: I1129 02:15:12.735514 4749 scope.go:117] "RemoveContainer" containerID="0a06a14a4ee2ac82fb078478a0a978c83014068cbf32277336124049f0390eb6" Nov 29 02:15:14 crc kubenswrapper[4749]: I1129 02:15:14.075181 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:15:14 crc kubenswrapper[4749]: E1129 02:15:14.075946 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:15:26 crc kubenswrapper[4749]: I1129 02:15:26.075006 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:15:26 crc kubenswrapper[4749]: E1129 02:15:26.075890 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:15:41 crc kubenswrapper[4749]: I1129 02:15:41.076082 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:15:41 crc kubenswrapper[4749]: E1129 02:15:41.077345 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:15:54 crc kubenswrapper[4749]: I1129 02:15:54.074925 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:15:54 crc kubenswrapper[4749]: E1129 02:15:54.075754 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:16:06 crc kubenswrapper[4749]: I1129 02:16:06.075553 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:16:06 crc kubenswrapper[4749]: E1129 02:16:06.076485 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:16:18 crc kubenswrapper[4749]: I1129 02:16:18.075449 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:16:18 crc kubenswrapper[4749]: E1129 02:16:18.076273 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:16:31 crc kubenswrapper[4749]: I1129 02:16:31.075184 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:16:32 crc kubenswrapper[4749]: I1129 02:16:32.166911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3"} Nov 29 02:17:12 crc kubenswrapper[4749]: I1129 02:17:12.792655 4749 scope.go:117] 
"RemoveContainer" containerID="b303afe00fb2c1fae69092bf334baa2d148390397ac01ed4ceadc9d421f7e498" Nov 29 02:17:12 crc kubenswrapper[4749]: I1129 02:17:12.835503 4749 scope.go:117] "RemoveContainer" containerID="9d72709185eefeb3729834b991042ee65389e90b2d9b2b6ca6eb4fc38a62767f" Nov 29 02:17:12 crc kubenswrapper[4749]: I1129 02:17:12.889609 4749 scope.go:117] "RemoveContainer" containerID="5219ec56b253b21bd2d3e889d5aa9a57db47f3e7775bf1524b23548d2b1285c2" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.351130 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:24 crc kubenswrapper[4749]: E1129 02:17:24.355389 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7973bd-913f-4441-8e08-9d5ab221b673" containerName="collect-profiles" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.355441 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7973bd-913f-4441-8e08-9d5ab221b673" containerName="collect-profiles" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.356669 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7973bd-913f-4441-8e08-9d5ab221b673" containerName="collect-profiles" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.359366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.362537 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.517069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.517457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxbt\" (UniqueName: \"kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.517602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.619264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.619362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxbt\" (UniqueName: \"kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt\") pod \"redhat-operators-2kbrg\" (UID: 
\"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.619412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.620065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.620442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.656917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxbt\" (UniqueName: \"kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt\") pod \"redhat-operators-2kbrg\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:24 crc kubenswrapper[4749]: I1129 02:17:24.708834 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:25 crc kubenswrapper[4749]: I1129 02:17:25.159771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:25 crc kubenswrapper[4749]: I1129 02:17:25.663374 4749 generic.go:334] "Generic (PLEG): container finished" podID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerID="b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec" exitCode=0 Nov 29 02:17:25 crc kubenswrapper[4749]: I1129 02:17:25.663471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerDied","Data":"b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec"} Nov 29 02:17:25 crc kubenswrapper[4749]: I1129 02:17:25.663630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerStarted","Data":"94c14a6a92ad857fdb5a574299daf82465d504f4be82b63108a8c845db2a157e"} Nov 29 02:17:25 crc kubenswrapper[4749]: I1129 02:17:25.665782 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:17:27 crc kubenswrapper[4749]: I1129 02:17:27.689454 4749 generic.go:334] "Generic (PLEG): container finished" podID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerID="f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef" exitCode=0 Nov 29 02:17:27 crc kubenswrapper[4749]: I1129 02:17:27.689529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" 
event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerDied","Data":"f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef"} Nov 29 02:17:28 crc kubenswrapper[4749]: I1129 02:17:28.715475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerStarted","Data":"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120"} Nov 29 02:17:28 crc kubenswrapper[4749]: I1129 02:17:28.747415 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2kbrg" podStartSLOduration=2.192012664 podStartE2EDuration="4.747397027s" podCreationTimestamp="2025-11-29 02:17:24 +0000 UTC" firstStartedPulling="2025-11-29 02:17:25.66557958 +0000 UTC m=+3988.837729437" lastFinishedPulling="2025-11-29 02:17:28.220963923 +0000 UTC m=+3991.393113800" observedRunningTime="2025-11-29 02:17:28.743329209 +0000 UTC m=+3991.915479106" watchObservedRunningTime="2025-11-29 02:17:28.747397027 +0000 UTC m=+3991.919546894" Nov 29 02:17:34 crc kubenswrapper[4749]: I1129 02:17:34.709500 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:34 crc kubenswrapper[4749]: I1129 02:17:34.714229 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:35 crc kubenswrapper[4749]: I1129 02:17:35.790550 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2kbrg" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="registry-server" probeResult="failure" output=< Nov 29 02:17:35 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 02:17:35 crc kubenswrapper[4749]: > Nov 29 02:17:44 crc kubenswrapper[4749]: I1129 02:17:44.785618 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:44 crc kubenswrapper[4749]: I1129 02:17:44.856599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:45 crc kubenswrapper[4749]: I1129 02:17:45.040479 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:46 crc kubenswrapper[4749]: I1129 02:17:46.211563 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2kbrg" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="registry-server" containerID="cri-o://431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120" gracePeriod=2 Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.855497 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.864141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities\") pod \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.864192 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sxbt\" (UniqueName: \"kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt\") pod \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.864244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content\") pod \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\" (UID: \"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865\") " Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.865584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities" (OuterVolumeSpecName: "utilities") pod "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" (UID: "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.884723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt" (OuterVolumeSpecName: "kube-api-access-6sxbt") pod "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" (UID: "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865"). InnerVolumeSpecName "kube-api-access-6sxbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.966491 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:17:47 crc kubenswrapper[4749]: I1129 02:17:47.966537 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sxbt\" (UniqueName: \"kubernetes.io/projected/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-kube-api-access-6sxbt\") on node \"crc\" DevicePath \"\"" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.025160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" (UID: "1668d6e0-e6a9-4708-b8f1-97f5e7f0f865"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.068150 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.233343 4749 generic.go:334] "Generic (PLEG): container finished" podID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerID="431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120" exitCode=0 Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.233408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerDied","Data":"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120"} Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.233451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kbrg" event={"ID":"1668d6e0-e6a9-4708-b8f1-97f5e7f0f865","Type":"ContainerDied","Data":"94c14a6a92ad857fdb5a574299daf82465d504f4be82b63108a8c845db2a157e"} Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.233473 4749 scope.go:117] "RemoveContainer" containerID="431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.233961 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kbrg" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.284700 4749 scope.go:117] "RemoveContainer" containerID="f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.287266 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.300116 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2kbrg"] Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.316867 4749 scope.go:117] "RemoveContainer" containerID="b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.359899 4749 scope.go:117] "RemoveContainer" containerID="431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120" Nov 29 02:17:48 crc kubenswrapper[4749]: E1129 02:17:48.360432 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120\": container with ID starting with 431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120 not found: ID does not exist" containerID="431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.360523 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120"} err="failed to get container status \"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120\": rpc error: code = NotFound desc = could not find container \"431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120\": container with ID starting with 431aa38539e11476729e40b03ada4d4c77f31ccaf6a94bd55be6bfd5bfb92120 not found: ID does not exist" Nov 29 02:17:48 crc 
kubenswrapper[4749]: I1129 02:17:48.360565 4749 scope.go:117] "RemoveContainer" containerID="f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef" Nov 29 02:17:48 crc kubenswrapper[4749]: E1129 02:17:48.361512 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef\": container with ID starting with f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef not found: ID does not exist" containerID="f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.361549 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef"} err="failed to get container status \"f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef\": rpc error: code = NotFound desc = could not find container \"f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef\": container with ID starting with f90e100fb5ca41b5f818b0364c535b274c5b6df697a366fbde0d3937558350ef not found: ID does not exist" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.361580 4749 scope.go:117] "RemoveContainer" containerID="b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec" Nov 29 02:17:48 crc kubenswrapper[4749]: E1129 02:17:48.361873 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec\": container with ID starting with b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec not found: ID does not exist" containerID="b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec" Nov 29 02:17:48 crc kubenswrapper[4749]: I1129 02:17:48.361991 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec"} err="failed to get container status \"b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec\": rpc error: code = NotFound desc = could not find container \"b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec\": container with ID starting with b3ed5aeff172cf16057fff5d74d0f0d929df40dc7fff5a322536cbcedf35aaec not found: ID does not exist" Nov 29 02:17:49 crc kubenswrapper[4749]: I1129 02:17:49.092370 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" path="/var/lib/kubelet/pods/1668d6e0-e6a9-4708-b8f1-97f5e7f0f865/volumes" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.144879 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:18:50 crc kubenswrapper[4749]: E1129 02:18:50.146178 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="registry-server" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.146237 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="registry-server" Nov 29 02:18:50 crc kubenswrapper[4749]: E1129 02:18:50.146267 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="extract-content" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.146281 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="extract-content" Nov 29 02:18:50 crc kubenswrapper[4749]: E1129 02:18:50.146305 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="extract-utilities" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.146318 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="extract-utilities" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.146602 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1668d6e0-e6a9-4708-b8f1-97f5e7f0f865" containerName="registry-server" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.148683 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.158163 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.197489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.197608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.197798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djvg\" (UniqueName: \"kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.298652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.298739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.298815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7djvg\" (UniqueName: \"kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc 
kubenswrapper[4749]: I1129 02:18:50.299213 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.299440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.333110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djvg\" (UniqueName: \"kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg\") pod \"community-operators-wht47\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:50 crc kubenswrapper[4749]: I1129 02:18:50.479964 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:18:51 crc kubenswrapper[4749]: I1129 02:18:51.059571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:18:51 crc kubenswrapper[4749]: I1129 02:18:51.877585 4749 generic.go:334] "Generic (PLEG): container finished" podID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerID="7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261" exitCode=0 Nov 29 02:18:51 crc kubenswrapper[4749]: I1129 02:18:51.877648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerDied","Data":"7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261"} Nov 29 02:18:51 crc kubenswrapper[4749]: I1129 02:18:51.877973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerStarted","Data":"e24e63d5ee7707474006564023b4c5d4d223e4591f4d9de8789264fb47647bf7"} Nov 29 02:18:53 crc kubenswrapper[4749]: I1129 02:18:53.900506 4749 generic.go:334] "Generic (PLEG): container finished" podID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerID="69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27" exitCode=0 Nov 29 02:18:53 crc kubenswrapper[4749]: I1129 02:18:53.900587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerDied","Data":"69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27"} Nov 29 02:18:54 crc kubenswrapper[4749]: I1129 02:18:54.913970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerStarted","Data":"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07"} Nov 29 02:18:54 crc kubenswrapper[4749]: I1129 02:18:54.938972 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wht47" podStartSLOduration=2.3897448519999998 
podStartE2EDuration="4.93895405s" podCreationTimestamp="2025-11-29 02:18:50 +0000 UTC" firstStartedPulling="2025-11-29 02:18:51.880048498 +0000 UTC m=+4075.052198365" lastFinishedPulling="2025-11-29 02:18:54.429257666 +0000 UTC m=+4077.601407563" observedRunningTime="2025-11-29 02:18:54.936071321 +0000 UTC m=+4078.108221218" watchObservedRunningTime="2025-11-29 02:18:54.93895405 +0000 UTC m=+4078.111103917" Nov 29 02:18:55 crc kubenswrapper[4749]: I1129 02:18:55.374380 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:18:55 crc kubenswrapper[4749]: I1129 02:18:55.374430 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:19:00 crc kubenswrapper[4749]: I1129 02:19:00.481368 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:00 crc kubenswrapper[4749]: I1129 02:19:00.482052 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:00 crc kubenswrapper[4749]: I1129 02:19:00.543374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:01 crc kubenswrapper[4749]: I1129 02:19:01.045436 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:01 crc kubenswrapper[4749]: I1129 02:19:01.132679 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:19:02 crc kubenswrapper[4749]: I1129 02:19:02.981929 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wht47" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="registry-server" containerID="cri-o://39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07" gracePeriod=2 Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.860613 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.993584 4749 generic.go:334] "Generic (PLEG): container finished" podID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerID="39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07" exitCode=0 Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.993648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerDied","Data":"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07"} Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.993690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wht47" event={"ID":"70d8111c-3afd-4d86-844b-5df29178a5c4","Type":"ContainerDied","Data":"e24e63d5ee7707474006564023b4c5d4d223e4591f4d9de8789264fb47647bf7"} Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.993719 4749 scope.go:117] "RemoveContainer" containerID="39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07" Nov 29 02:19:03 crc kubenswrapper[4749]: I1129 02:19:03.993891 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wht47" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.007568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content\") pod \"70d8111c-3afd-4d86-844b-5df29178a5c4\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.007649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities\") pod \"70d8111c-3afd-4d86-844b-5df29178a5c4\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.007802 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7djvg\" (UniqueName: \"kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg\") pod \"70d8111c-3afd-4d86-844b-5df29178a5c4\" (UID: \"70d8111c-3afd-4d86-844b-5df29178a5c4\") " Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.009454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities" (OuterVolumeSpecName: "utilities") pod "70d8111c-3afd-4d86-844b-5df29178a5c4" (UID: "70d8111c-3afd-4d86-844b-5df29178a5c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.018025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg" (OuterVolumeSpecName: "kube-api-access-7djvg") pod "70d8111c-3afd-4d86-844b-5df29178a5c4" (UID: "70d8111c-3afd-4d86-844b-5df29178a5c4"). InnerVolumeSpecName "kube-api-access-7djvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.023984 4749 scope.go:117] "RemoveContainer" containerID="69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.074867 4749 scope.go:117] "RemoveContainer" containerID="7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.084452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70d8111c-3afd-4d86-844b-5df29178a5c4" (UID: "70d8111c-3afd-4d86-844b-5df29178a5c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.110568 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.110625 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70d8111c-3afd-4d86-844b-5df29178a5c4-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.110642 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7djvg\" (UniqueName: \"kubernetes.io/projected/70d8111c-3afd-4d86-844b-5df29178a5c4-kube-api-access-7djvg\") on node \"crc\" DevicePath \"\"" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.133353 4749 scope.go:117] "RemoveContainer" containerID="39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07" Nov 29 02:19:04 crc kubenswrapper[4749]: E1129 02:19:04.133861 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07\": container with ID starting with 39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07 not found: ID does not exist" containerID="39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.133904 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07"} err="failed to get container status \"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07\": rpc error: code = NotFound desc = could not find container \"39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07\": container with ID starting with 39a0b3a37d0979b7993b7834bf5894160086481c61db469ba85fbdd4589edd07 not found: ID does not exist" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.133931 4749 scope.go:117] "RemoveContainer" containerID="69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27" Nov 29 02:19:04 crc kubenswrapper[4749]: E1129 02:19:04.134502 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27\": container with ID starting with 69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27 not found: ID does not exist" containerID="69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27" Nov 
29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.134539 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27"} err="failed to get container status \"69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27\": rpc error: code = NotFound desc = could not find container \"69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27\": container with ID starting with 69943c0c91b931e4074b06e188c7360cc14b6bd1038981cc8972e8f3466ddf27 not found: ID does not exist" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.134558 4749 scope.go:117] "RemoveContainer" containerID="7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261" Nov 29 02:19:04 crc kubenswrapper[4749]: E1129 02:19:04.135080 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261\": container with ID starting with 7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261 not found: ID does not exist" containerID="7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.135128 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261"} err="failed to get container status \"7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261\": rpc error: code = NotFound desc = could not find container \"7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261\": container with ID starting with 7686a28ae8107e98dbc01ff0ebfff314dd0ffcc4cfb3dba33f21f5ad93719261 not found: ID does not exist" Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.349377 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:19:04 crc kubenswrapper[4749]: I1129 02:19:04.358035 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wht47"] Nov 29 02:19:05 crc kubenswrapper[4749]: I1129 02:19:05.090353 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" path="/var/lib/kubelet/pods/70d8111c-3afd-4d86-844b-5df29178a5c4/volumes" Nov 29 02:19:25 crc kubenswrapper[4749]: I1129 02:19:25.374680 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:19:25 crc kubenswrapper[4749]: I1129 02:19:25.375325 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.374530 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:19:55 crc 
kubenswrapper[4749]: I1129 02:19:55.375268 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.375349 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.376235 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.376336 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3" gracePeriod=600 Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.520806 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3" exitCode=0 Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.520886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3"} Nov 29 02:19:55 crc kubenswrapper[4749]: I1129 02:19:55.521326 4749 scope.go:117] "RemoveContainer" containerID="e29c0426c97f627ca3b8d92b15d01ea669190b4ae2e868477bd76000c3c0b14e" Nov 29 02:19:56 crc kubenswrapper[4749]: I1129 02:19:56.561181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c"} Nov 29 02:21:55 crc kubenswrapper[4749]: I1129 02:21:55.374507 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:21:55 crc kubenswrapper[4749]: I1129 02:21:55.375126 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:22:25 crc kubenswrapper[4749]: I1129 02:22:25.374492 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:22:25 crc kubenswrapper[4749]: I1129 02:22:25.375000 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:22:55 crc kubenswrapper[4749]: I1129 02:22:55.374771 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:22:55 crc kubenswrapper[4749]: I1129 02:22:55.375395 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:22:55 crc kubenswrapper[4749]: I1129 02:22:55.375457 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:22:55 crc kubenswrapper[4749]: I1129 02:22:55.376228 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:22:55 crc kubenswrapper[4749]: I1129 02:22:55.376319 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" gracePeriod=600 Nov 29 02:22:55 crc kubenswrapper[4749]: E1129 02:22:55.511270 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:22:56 crc kubenswrapper[4749]: I1129 02:22:56.262144 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" exitCode=0 Nov 29 02:22:56 crc kubenswrapper[4749]: I1129 02:22:56.262237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c"} Nov 29 02:22:56 crc kubenswrapper[4749]: I1129 02:22:56.262327 4749 scope.go:117] "RemoveContainer" containerID="0eeb876091eb9c5b55acd50961cccbb1c033fa7ce763a1aa5544083aba6ec3a3" Nov 29 02:22:56 crc kubenswrapper[4749]: I1129 
02:22:56.263194 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:22:56 crc kubenswrapper[4749]: E1129 02:22:56.263810 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:07 crc kubenswrapper[4749]: I1129 02:23:07.083906 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:23:07 crc kubenswrapper[4749]: E1129 02:23:07.084918 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:19 crc kubenswrapper[4749]: I1129 02:23:19.075489 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:23:19 crc kubenswrapper[4749]: E1129 02:23:19.076264 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:30 crc kubenswrapper[4749]: I1129 02:23:30.075475 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:23:30 crc kubenswrapper[4749]: E1129 02:23:30.076582 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:42 crc kubenswrapper[4749]: I1129 02:23:42.075161 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:23:42 crc kubenswrapper[4749]: E1129 02:23:42.076126 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:54 crc kubenswrapper[4749]: I1129 02:23:54.076170 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:23:54 crc kubenswrapper[4749]: E1129 02:23:54.077151 
4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.486244 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:23:56 crc kubenswrapper[4749]: E1129 02:23:56.487243 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="extract-utilities" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.487277 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="extract-utilities" Nov 29 02:23:56 crc kubenswrapper[4749]: E1129 02:23:56.487309 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="registry-server" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.487326 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="registry-server" Nov 29 02:23:56 crc kubenswrapper[4749]: E1129 02:23:56.487362 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="extract-content" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.487379 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="extract-content" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.487707 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="70d8111c-3afd-4d86-844b-5df29178a5c4" containerName="registry-server" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.489655 4749 util.go:30] "No sandbox for pod can be found. 
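
From 02:22:56 onward the machine-config-daemon is in CrashLoopBackOff: every sync attempt above is rejected with "back-off 5m0s restarting failed container" until the back-off window expires. The delay doubles per crash and is capped at the 5m0s quoted in the message; the 10s starting point below is the documented kubelet default, not a value read from this log:

```go
// Sketch of the restart back-off schedule behind the repeated
// "back-off 5m0s restarting failed container" errors.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second // documented default initial back-off
	const maxDelay = 5 * time.Minute
	for crash := 1; crash <= 8; crash++ {
		fmt.Printf("crash %d: next restart attempt in %v\n", crash, delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay // 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s flat
		}
	}
}
```

A successful run long enough to be considered healthy resets the back-off, which is why the earlier 02:19:55 restart went through immediately while these later attempts are throttled.
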
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.504447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.676716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.676767 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.676843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4rbj\" (UniqueName: \"kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.779087 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4rbj\" (UniqueName: \"kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.779381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.779444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.780044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.780363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:56 crc kubenswrapper[4749]: I1129 02:23:56.822057 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c4rbj\" (UniqueName: \"kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj\") pod \"redhat-marketplace-2jscg\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.113491 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.420673 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.844750 4749 generic.go:334] "Generic (PLEG): container finished" podID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerID="e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc" exitCode=0 Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.845268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerDied","Data":"e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc"} Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.845359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerStarted","Data":"9b293f440e369c9830a19f917ac3499e2a856566f1ba66e13cd77f70c9e5955e"} Nov 29 02:23:57 crc kubenswrapper[4749]: I1129 02:23:57.850584 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:23:58 crc kubenswrapper[4749]: I1129 02:23:58.874796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerStarted","Data":"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207"} Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.687955 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.690640 4749 util.go:30] "No sandbox for pod can be found. 
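
Every catalog pod in this log mounts the same trio: two emptyDir volumes (utilities, catalog-content) plus one projected kube-api-access-* volume. Only the emptyDirs are declared by the pod author; the kube-api-access volume is the auto-injected service-account token projection. A sketch of the declared half, assuming the k8s.io/api Go types:

```go
// Sketch, assuming k8s.io/api: the two emptyDir volumes the reconciler
// mounts for each catalog pod. The kube-api-access-* projected volume is
// injected by the API server and never appears in the manifest.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	volumes := []corev1.Volume{
		{Name: "utilities", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{
			EmptyDir: &corev1.EmptyDirVolumeSource{}}},
	}
	for _, v := range volumes {
		fmt.Println(v.Name)
	}
}
```

Because both are emptyDir, their contents die with the pod, which is why each catalog refresh re-runs extract-utilities and extract-content from scratch.
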
Need to start a new one" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.698809 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.755343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.755509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wftk8\" (UniqueName: \"kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.755556 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.857099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.857224 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wftk8\" (UniqueName: \"kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.857257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.857874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.858094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.885826 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerID="3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207" exitCode=0 Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.885874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerDied","Data":"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207"} Nov 29 02:23:59 crc kubenswrapper[4749]: I1129 02:23:59.886930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wftk8\" (UniqueName: \"kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8\") pod \"certified-operators-mgq5q\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.027433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.303785 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:24:00 crc kubenswrapper[4749]: W1129 02:24:00.315906 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b40ba3b_6ded_4215_830e_d71c1f120b43.slice/crio-2feb3347463663c60281a3239e28cb7c20e7d8c20b641731e06043bddebcbc3e WatchSource:0}: Error finding container 2feb3347463663c60281a3239e28cb7c20e7d8c20b641731e06043bddebcbc3e: Status 404 returned error can't find the container with id 2feb3347463663c60281a3239e28cb7c20e7d8c20b641731e06043bddebcbc3e Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.896667 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerStarted","Data":"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6"} Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.898679 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerID="cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474" exitCode=0 Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.898750 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerDied","Data":"cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474"} Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.898799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerStarted","Data":"2feb3347463663c60281a3239e28cb7c20e7d8c20b641731e06043bddebcbc3e"} Nov 29 02:24:00 crc kubenswrapper[4749]: I1129 02:24:00.925011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2jscg" podStartSLOduration=2.312657013 podStartE2EDuration="4.924988524s" podCreationTimestamp="2025-11-29 02:23:56 +0000 UTC" firstStartedPulling="2025-11-29 02:23:57.85015673 +0000 UTC m=+4381.022306617" lastFinishedPulling="2025-11-29 02:24:00.462488261 +0000 UTC m=+4383.634638128" observedRunningTime="2025-11-29 02:24:00.922274609 +0000 UTC 
m=+4384.094424536" watchObservedRunningTime="2025-11-29 02:24:00.924988524 +0000 UTC m=+4384.097138391" Nov 29 02:24:01 crc kubenswrapper[4749]: I1129 02:24:01.907857 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerID="44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665" exitCode=0 Nov 29 02:24:01 crc kubenswrapper[4749]: I1129 02:24:01.907954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerDied","Data":"44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665"} Nov 29 02:24:02 crc kubenswrapper[4749]: I1129 02:24:02.919249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerStarted","Data":"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181"} Nov 29 02:24:02 crc kubenswrapper[4749]: I1129 02:24:02.952233 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mgq5q" podStartSLOduration=2.527101373 podStartE2EDuration="3.952175402s" podCreationTimestamp="2025-11-29 02:23:59 +0000 UTC" firstStartedPulling="2025-11-29 02:24:00.900448476 +0000 UTC m=+4384.072598373" lastFinishedPulling="2025-11-29 02:24:02.325522545 +0000 UTC m=+4385.497672402" observedRunningTime="2025-11-29 02:24:02.944389846 +0000 UTC m=+4386.116539723" watchObservedRunningTime="2025-11-29 02:24:02.952175402 +0000 UTC m=+4386.124325299" Nov 29 02:24:07 crc kubenswrapper[4749]: I1129 02:24:07.114827 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:07 crc kubenswrapper[4749]: I1129 02:24:07.115890 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:07 crc kubenswrapper[4749]: I1129 02:24:07.192060 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:08 crc kubenswrapper[4749]: I1129 02:24:08.037764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:08 crc kubenswrapper[4749]: I1129 02:24:08.075542 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:24:08 crc kubenswrapper[4749]: E1129 02:24:08.076074 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:24:08 crc kubenswrapper[4749]: I1129 02:24:08.110675 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:24:09 crc kubenswrapper[4749]: I1129 02:24:09.982437 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2jscg" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="registry-server" 
containerID="cri-o://9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6" gracePeriod=2 Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.027669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.027770 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.102038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.984676 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.997753 4749 generic.go:334] "Generic (PLEG): container finished" podID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerID="9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6" exitCode=0 Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.997831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerDied","Data":"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6"} Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.997849 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jscg" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.997876 4749 scope.go:117] "RemoveContainer" containerID="9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6" Nov 29 02:24:10 crc kubenswrapper[4749]: I1129 02:24:10.997864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jscg" event={"ID":"e588ec22-e995-4c59-a1db-2bfc07c0fdcf","Type":"ContainerDied","Data":"9b293f440e369c9830a19f917ac3499e2a856566f1ba66e13cd77f70c9e5955e"} Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.051865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4rbj\" (UniqueName: \"kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj\") pod \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.051955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities\") pod \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.052018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content\") pod \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\" (UID: \"e588ec22-e995-4c59-a1db-2bfc07c0fdcf\") " Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.056234 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities" (OuterVolumeSpecName: "utilities") pod "e588ec22-e995-4c59-a1db-2bfc07c0fdcf" (UID: "e588ec22-e995-4c59-a1db-2bfc07c0fdcf"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.068119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj" (OuterVolumeSpecName: "kube-api-access-c4rbj") pod "e588ec22-e995-4c59-a1db-2bfc07c0fdcf" (UID: "e588ec22-e995-4c59-a1db-2bfc07c0fdcf"). InnerVolumeSpecName "kube-api-access-c4rbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.073185 4749 scope.go:117] "RemoveContainer" containerID="3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.080229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e588ec22-e995-4c59-a1db-2bfc07c0fdcf" (UID: "e588ec22-e995-4c59-a1db-2bfc07c0fdcf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.097974 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.104112 4749 scope.go:117] "RemoveContainer" containerID="e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.133728 4749 scope.go:117] "RemoveContainer" containerID="9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6" Nov 29 02:24:11 crc kubenswrapper[4749]: E1129 02:24:11.134473 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6\": container with ID starting with 9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6 not found: ID does not exist" containerID="9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.134593 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6"} err="failed to get container status \"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6\": rpc error: code = NotFound desc = could not find container \"9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6\": container with ID starting with 9bbf489ab436cfcc44a97870942ca3d66b82e10420d63851c6e79e9ea6220bd6 not found: ID does not exist" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.134712 4749 scope.go:117] "RemoveContainer" containerID="3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207" Nov 29 02:24:11 crc kubenswrapper[4749]: E1129 02:24:11.135494 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207\": container with ID starting with 3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207 not found: ID does not exist" containerID="3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.135556 4749 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207"} err="failed to get container status \"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207\": rpc error: code = NotFound desc = could not find container \"3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207\": container with ID starting with 3983bcac4576847ff064cf9f4ec74dbdf529ec5e56ff1ab3d389f49e3695b207 not found: ID does not exist" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.135601 4749 scope.go:117] "RemoveContainer" containerID="e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc" Nov 29 02:24:11 crc kubenswrapper[4749]: E1129 02:24:11.136117 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc\": container with ID starting with e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc not found: ID does not exist" containerID="e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.136423 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc"} err="failed to get container status \"e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc\": rpc error: code = NotFound desc = could not find container \"e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc\": container with ID starting with e52b9b97a36b43b0d68651c837366afb6c5ba63aba57b83a771e0f2175998ddc not found: ID does not exist" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.153614 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4rbj\" (UniqueName: \"kubernetes.io/projected/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-kube-api-access-c4rbj\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.153724 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.153804 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e588ec22-e995-4c59-a1db-2bfc07c0fdcf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.326338 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:24:11 crc kubenswrapper[4749]: I1129 02:24:11.331120 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jscg"] Nov 29 02:24:12 crc kubenswrapper[4749]: I1129 02:24:12.245383 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:24:13 crc kubenswrapper[4749]: I1129 02:24:13.093876 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" path="/var/lib/kubelet/pods/e588ec22-e995-4c59-a1db-2bfc07c0fdcf/volumes" Nov 29 02:24:14 crc kubenswrapper[4749]: I1129 02:24:14.033795 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mgq5q" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="registry-server" 
containerID="cri-o://645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181" gracePeriod=2 Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.039062 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.042909 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerID="645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181" exitCode=0 Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.042993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerDied","Data":"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181"} Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.043018 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgq5q" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.043078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgq5q" event={"ID":"0b40ba3b-6ded-4215-830e-d71c1f120b43","Type":"ContainerDied","Data":"2feb3347463663c60281a3239e28cb7c20e7d8c20b641731e06043bddebcbc3e"} Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.043130 4749 scope.go:117] "RemoveContainer" containerID="645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.081371 4749 scope.go:117] "RemoveContainer" containerID="44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.107417 4749 scope.go:117] "RemoveContainer" containerID="cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.120307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content\") pod \"0b40ba3b-6ded-4215-830e-d71c1f120b43\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.120362 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities\") pod \"0b40ba3b-6ded-4215-830e-d71c1f120b43\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.120446 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wftk8\" (UniqueName: \"kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8\") pod \"0b40ba3b-6ded-4215-830e-d71c1f120b43\" (UID: \"0b40ba3b-6ded-4215-830e-d71c1f120b43\") " Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.121977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities" (OuterVolumeSpecName: "utilities") pod "0b40ba3b-6ded-4215-830e-d71c1f120b43" (UID: "0b40ba3b-6ded-4215-830e-d71c1f120b43"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.122074 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.128835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8" (OuterVolumeSpecName: "kube-api-access-wftk8") pod "0b40ba3b-6ded-4215-830e-d71c1f120b43" (UID: "0b40ba3b-6ded-4215-830e-d71c1f120b43"). InnerVolumeSpecName "kube-api-access-wftk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.136441 4749 scope.go:117] "RemoveContainer" containerID="645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181" Nov 29 02:24:15 crc kubenswrapper[4749]: E1129 02:24:15.137942 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181\": container with ID starting with 645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181 not found: ID does not exist" containerID="645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.137998 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181"} err="failed to get container status \"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181\": rpc error: code = NotFound desc = could not find container \"645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181\": container with ID starting with 645deab72c8f9b579f447b3c3ad6c8ea9abbcb0264678ebee707e9b4bd38e181 not found: ID does not exist" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.138034 4749 scope.go:117] "RemoveContainer" containerID="44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665" Nov 29 02:24:15 crc kubenswrapper[4749]: E1129 02:24:15.138403 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665\": container with ID starting with 44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665 not found: ID does not exist" containerID="44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.138435 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665"} err="failed to get container status \"44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665\": rpc error: code = NotFound desc = could not find container \"44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665\": container with ID starting with 44f831cda22a00d024b589f4346ed3654fe9692ec6cd6afcf4bf7be84d4d5665 not found: ID does not exist" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.138455 4749 scope.go:117] "RemoveContainer" containerID="cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474" Nov 29 02:24:15 crc kubenswrapper[4749]: E1129 02:24:15.138808 4749 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474\": container with ID starting with cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474 not found: ID does not exist" containerID="cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.138836 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474"} err="failed to get container status \"cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474\": rpc error: code = NotFound desc = could not find container \"cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474\": container with ID starting with cfe3b361a2116c7c18f277d8eb5066556ef36b7afd7b1a464046da1143f72474 not found: ID does not exist" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.172999 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b40ba3b-6ded-4215-830e-d71c1f120b43" (UID: "0b40ba3b-6ded-4215-830e-d71c1f120b43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.223159 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wftk8\" (UniqueName: \"kubernetes.io/projected/0b40ba3b-6ded-4215-830e-d71c1f120b43-kube-api-access-wftk8\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.223273 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b40ba3b-6ded-4215-830e-d71c1f120b43-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.397434 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:24:15 crc kubenswrapper[4749]: I1129 02:24:15.407899 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mgq5q"] Nov 29 02:24:17 crc kubenswrapper[4749]: I1129 02:24:17.092379 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" path="/var/lib/kubelet/pods/0b40ba3b-6ded-4215-830e-d71c1f120b43/volumes" Nov 29 02:24:23 crc kubenswrapper[4749]: I1129 02:24:23.075636 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:24:23 crc kubenswrapper[4749]: E1129 02:24:23.076359 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:24:34 crc kubenswrapper[4749]: I1129 02:24:34.075988 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:24:34 crc kubenswrapper[4749]: E1129 02:24:34.077081 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:24:46 crc kubenswrapper[4749]: I1129 02:24:46.075905 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:24:46 crc kubenswrapper[4749]: E1129 02:24:46.078780 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:24:59 crc kubenswrapper[4749]: I1129 02:24:59.075734 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:24:59 crc kubenswrapper[4749]: E1129 02:24:59.078013 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:25:13 crc kubenswrapper[4749]: I1129 02:25:13.074964 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:25:13 crc kubenswrapper[4749]: E1129 02:25:13.075937 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:25:26 crc kubenswrapper[4749]: I1129 02:25:26.075192 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:25:26 crc kubenswrapper[4749]: E1129 02:25:26.076052 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:25:37 crc kubenswrapper[4749]: I1129 02:25:37.083674 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:25:37 crc kubenswrapper[4749]: E1129 02:25:37.084783 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:25:49 crc kubenswrapper[4749]: I1129 02:25:49.076407 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:25:49 crc kubenswrapper[4749]: E1129 02:25:49.078026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:26:00 crc kubenswrapper[4749]: I1129 02:26:00.075593 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:26:00 crc kubenswrapper[4749]: E1129 02:26:00.076315 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:26:11 crc kubenswrapper[4749]: I1129 02:26:11.075805 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:26:11 crc kubenswrapper[4749]: E1129 02:26:11.076671 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:26:24 crc kubenswrapper[4749]: I1129 02:26:24.074948 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:26:24 crc kubenswrapper[4749]: E1129 02:26:24.075679 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:26:37 crc kubenswrapper[4749]: I1129 02:26:37.086748 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:26:37 crc kubenswrapper[4749]: E1129 02:26:37.088075 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:26:49 crc kubenswrapper[4749]: I1129 02:26:49.075110 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:26:49 crc kubenswrapper[4749]: E1129 02:26:49.076043 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:27:02 crc kubenswrapper[4749]: I1129 02:27:02.077899 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:27:02 crc kubenswrapper[4749]: E1129 02:27:02.082161 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:27:17 crc kubenswrapper[4749]: I1129 02:27:17.084000 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:27:17 crc kubenswrapper[4749]: E1129 02:27:17.085375 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:27:29 crc kubenswrapper[4749]: I1129 02:27:29.075666 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:27:29 crc kubenswrapper[4749]: E1129 02:27:29.076630 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:27:44 crc kubenswrapper[4749]: I1129 02:27:44.075170 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:27:44 crc kubenswrapper[4749]: E1129 02:27:44.075748 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.070427 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072369 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="extract-content" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="extract-content" Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="extract-utilities" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072427 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="extract-utilities" Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072435 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="extract-content" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072441 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="extract-content" Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072449 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072455 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072480 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="extract-utilities" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072488 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="extract-utilities" Nov 29 02:27:52 crc kubenswrapper[4749]: E1129 02:27:52.072505 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072617 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072789 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e588ec22-e995-4c59-a1db-2bfc07c0fdcf" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.072808 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b40ba3b-6ded-4215-830e-d71c1f120b43" containerName="registry-server" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.077092 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.083626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.207899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhcn\" (UniqueName: \"kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.208123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.208166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.309048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.309101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.309121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhcn\" (UniqueName: \"kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.309753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.310041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.335286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nrhcn\" (UniqueName: \"kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn\") pod \"redhat-operators-mm2fh\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.408442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:27:52 crc kubenswrapper[4749]: I1129 02:27:52.850533 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:27:53 crc kubenswrapper[4749]: I1129 02:27:53.058253 4749 generic.go:334] "Generic (PLEG): container finished" podID="47ac71b9-98bf-4241-832a-293182733c7a" containerID="0c49c573d0b6e67b6360e728276c12802ddc7957e7cd553de061cfcf2a644013" exitCode=0 Nov 29 02:27:53 crc kubenswrapper[4749]: I1129 02:27:53.058355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerDied","Data":"0c49c573d0b6e67b6360e728276c12802ddc7957e7cd553de061cfcf2a644013"} Nov 29 02:27:53 crc kubenswrapper[4749]: I1129 02:27:53.058656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerStarted","Data":"c29ddda5bc1e2b57c36f3b238e5fc9ef04a416b86faa924ad3fc5b15ec4cefa0"} Nov 29 02:27:55 crc kubenswrapper[4749]: I1129 02:27:55.081884 4749 generic.go:334] "Generic (PLEG): container finished" podID="47ac71b9-98bf-4241-832a-293182733c7a" containerID="5c45a5e01034ab40a202dcd5f18f6bb1688d6f7ea2250a645a5e7e362df3d211" exitCode=0 Nov 29 02:27:55 crc kubenswrapper[4749]: I1129 02:27:55.090553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerDied","Data":"5c45a5e01034ab40a202dcd5f18f6bb1688d6f7ea2250a645a5e7e362df3d211"} Nov 29 02:27:56 crc kubenswrapper[4749]: I1129 02:27:56.092319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerStarted","Data":"a429184640821604370950b48bb7c12b4070ba63e486816d323de9773bdfe0cc"} Nov 29 02:27:56 crc kubenswrapper[4749]: I1129 02:27:56.123785 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mm2fh" podStartSLOduration=1.6512103900000001 podStartE2EDuration="4.123765558s" podCreationTimestamp="2025-11-29 02:27:52 +0000 UTC" firstStartedPulling="2025-11-29 02:27:53.060297287 +0000 UTC m=+4616.232447154" lastFinishedPulling="2025-11-29 02:27:55.532852455 +0000 UTC m=+4618.705002322" observedRunningTime="2025-11-29 02:27:56.110545519 +0000 UTC m=+4619.282695386" watchObservedRunningTime="2025-11-29 02:27:56.123765558 +0000 UTC m=+4619.295915425" Nov 29 02:27:58 crc kubenswrapper[4749]: I1129 02:27:58.074969 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c" Nov 29 02:27:59 crc kubenswrapper[4749]: I1129 02:27:59.132416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682"} Nov 29 02:28:02 crc kubenswrapper[4749]: I1129 02:28:02.409366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:02 crc kubenswrapper[4749]: I1129 02:28:02.409931 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:02 crc kubenswrapper[4749]: I1129 02:28:02.490150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:03 crc kubenswrapper[4749]: I1129 02:28:03.236920 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:03 crc kubenswrapper[4749]: I1129 02:28:03.329033 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:28:05 crc kubenswrapper[4749]: I1129 02:28:05.176555 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mm2fh" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="registry-server" containerID="cri-o://a429184640821604370950b48bb7c12b4070ba63e486816d323de9773bdfe0cc" gracePeriod=2 Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.204290 4749 generic.go:334] "Generic (PLEG): container finished" podID="47ac71b9-98bf-4241-832a-293182733c7a" containerID="a429184640821604370950b48bb7c12b4070ba63e486816d323de9773bdfe0cc" exitCode=0 Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.204431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerDied","Data":"a429184640821604370950b48bb7c12b4070ba63e486816d323de9773bdfe0cc"} Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.580983 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.604971 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ddv87"] Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.621652 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ddv87"] Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.735965 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-jshbs"] Nov 29 02:28:07 crc kubenswrapper[4749]: E1129 02:28:07.736543 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="registry-server" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.736577 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="registry-server" Nov 29 02:28:07 crc kubenswrapper[4749]: E1129 02:28:07.736625 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="extract-utilities" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.736639 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="extract-utilities" Nov 29 02:28:07 crc kubenswrapper[4749]: E1129 02:28:07.736672 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="extract-content" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.736687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="extract-content" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.736942 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="47ac71b9-98bf-4241-832a-293182733c7a" containerName="registry-server" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.737812 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.738261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content\") pod \"47ac71b9-98bf-4241-832a-293182733c7a\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.738449 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities\") pod \"47ac71b9-98bf-4241-832a-293182733c7a\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.738522 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrhcn\" (UniqueName: \"kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn\") pod \"47ac71b9-98bf-4241-832a-293182733c7a\" (UID: \"47ac71b9-98bf-4241-832a-293182733c7a\") " Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.741099 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cnnk7" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.741697 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.742165 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.744281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.748847 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jshbs"] Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.749223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities" (OuterVolumeSpecName: "utilities") pod "47ac71b9-98bf-4241-832a-293182733c7a" (UID: "47ac71b9-98bf-4241-832a-293182733c7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.753879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn" (OuterVolumeSpecName: "kube-api-access-nrhcn") pod "47ac71b9-98bf-4241-832a-293182733c7a" (UID: "47ac71b9-98bf-4241-832a-293182733c7a"). InnerVolumeSpecName "kube-api-access-nrhcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.840432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.840506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfjl\" (UniqueName: \"kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.840601 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.840790 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.840821 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrhcn\" (UniqueName: \"kubernetes.io/projected/47ac71b9-98bf-4241-832a-293182733c7a-kube-api-access-nrhcn\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.899216 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47ac71b9-98bf-4241-832a-293182733c7a" (UID: "47ac71b9-98bf-4241-832a-293182733c7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.942558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.942654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zfjl\" (UniqueName: \"kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.942767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.943000 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47ac71b9-98bf-4241-832a-293182733c7a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.943470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.944191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:07 crc kubenswrapper[4749]: I1129 02:28:07.964486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfjl\" (UniqueName: \"kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl\") pod \"crc-storage-crc-jshbs\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.075982 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.226664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mm2fh" event={"ID":"47ac71b9-98bf-4241-832a-293182733c7a","Type":"ContainerDied","Data":"c29ddda5bc1e2b57c36f3b238e5fc9ef04a416b86faa924ad3fc5b15ec4cefa0"} Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.226755 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mm2fh" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.226773 4749 scope.go:117] "RemoveContainer" containerID="a429184640821604370950b48bb7c12b4070ba63e486816d323de9773bdfe0cc" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.258248 4749 scope.go:117] "RemoveContainer" containerID="5c45a5e01034ab40a202dcd5f18f6bb1688d6f7ea2250a645a5e7e362df3d211" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.290708 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.302734 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mm2fh"] Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.311069 4749 scope.go:117] "RemoveContainer" containerID="0c49c573d0b6e67b6360e728276c12802ddc7957e7cd553de061cfcf2a644013" Nov 29 02:28:08 crc kubenswrapper[4749]: I1129 02:28:08.617048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-jshbs"] Nov 29 02:28:08 crc kubenswrapper[4749]: W1129 02:28:08.622257 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d192d81_2dfb_442c_98e2_ea29311de955.slice/crio-e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215 WatchSource:0}: Error finding container e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215: Status 404 returned error can't find the container with id e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215 Nov 29 02:28:09 crc kubenswrapper[4749]: I1129 02:28:09.093110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47ac71b9-98bf-4241-832a-293182733c7a" path="/var/lib/kubelet/pods/47ac71b9-98bf-4241-832a-293182733c7a/volumes" Nov 29 02:28:09 crc kubenswrapper[4749]: I1129 02:28:09.096061 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8d4b7b-6cf6-48c5-89d9-595c9b835306" path="/var/lib/kubelet/pods/fb8d4b7b-6cf6-48c5-89d9-595c9b835306/volumes" Nov 29 02:28:09 crc kubenswrapper[4749]: I1129 02:28:09.237307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jshbs" event={"ID":"6d192d81-2dfb-442c-98e2-ea29311de955","Type":"ContainerStarted","Data":"e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215"} Nov 29 02:28:10 crc kubenswrapper[4749]: I1129 02:28:10.252016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jshbs" event={"ID":"6d192d81-2dfb-442c-98e2-ea29311de955","Type":"ContainerStarted","Data":"ffa33d9e4d0351fb4f2c8a7d81a5e2c55b6d6ce9fd8ee92124ba30e011b039fc"} Nov 29 02:28:10 crc kubenswrapper[4749]: I1129 02:28:10.277703 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-jshbs" podStartSLOduration=2.203895804 podStartE2EDuration="3.277671616s" podCreationTimestamp="2025-11-29 02:28:07 +0000 UTC" firstStartedPulling="2025-11-29 02:28:08.626262251 +0000 UTC m=+4631.798412148" lastFinishedPulling="2025-11-29 02:28:09.700038073 +0000 UTC m=+4632.872187960" observedRunningTime="2025-11-29 02:28:10.275823671 +0000 UTC m=+4633.447973578" watchObservedRunningTime="2025-11-29 02:28:10.277671616 +0000 UTC m=+4633.449821543" Nov 29 02:28:11 crc kubenswrapper[4749]: I1129 02:28:11.265980 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="6d192d81-2dfb-442c-98e2-ea29311de955" containerID="ffa33d9e4d0351fb4f2c8a7d81a5e2c55b6d6ce9fd8ee92124ba30e011b039fc" exitCode=0 Nov 29 02:28:11 crc kubenswrapper[4749]: I1129 02:28:11.266044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jshbs" event={"ID":"6d192d81-2dfb-442c-98e2-ea29311de955","Type":"ContainerDied","Data":"ffa33d9e4d0351fb4f2c8a7d81a5e2c55b6d6ce9fd8ee92124ba30e011b039fc"} Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.676768 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.820170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt\") pod \"6d192d81-2dfb-442c-98e2-ea29311de955\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.820271 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage\") pod \"6d192d81-2dfb-442c-98e2-ea29311de955\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.820295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zfjl\" (UniqueName: \"kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl\") pod \"6d192d81-2dfb-442c-98e2-ea29311de955\" (UID: \"6d192d81-2dfb-442c-98e2-ea29311de955\") " Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.820294 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6d192d81-2dfb-442c-98e2-ea29311de955" (UID: "6d192d81-2dfb-442c-98e2-ea29311de955"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.820622 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6d192d81-2dfb-442c-98e2-ea29311de955-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.832248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl" (OuterVolumeSpecName: "kube-api-access-2zfjl") pod "6d192d81-2dfb-442c-98e2-ea29311de955" (UID: "6d192d81-2dfb-442c-98e2-ea29311de955"). InnerVolumeSpecName "kube-api-access-2zfjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.856066 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6d192d81-2dfb-442c-98e2-ea29311de955" (UID: "6d192d81-2dfb-442c-98e2-ea29311de955"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.922299 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6d192d81-2dfb-442c-98e2-ea29311de955-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:12 crc kubenswrapper[4749]: I1129 02:28:12.922335 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zfjl\" (UniqueName: \"kubernetes.io/projected/6d192d81-2dfb-442c-98e2-ea29311de955-kube-api-access-2zfjl\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:13 crc kubenswrapper[4749]: I1129 02:28:13.228595 4749 scope.go:117] "RemoveContainer" containerID="fe6938cd9b4daee386b0b15740a94067610d45d768906bf777a57e67ae1e4756" Nov 29 02:28:13 crc kubenswrapper[4749]: I1129 02:28:13.292749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-jshbs" event={"ID":"6d192d81-2dfb-442c-98e2-ea29311de955","Type":"ContainerDied","Data":"e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215"} Nov 29 02:28:13 crc kubenswrapper[4749]: I1129 02:28:13.292908 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-jshbs" Nov 29 02:28:13 crc kubenswrapper[4749]: I1129 02:28:13.294188 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e955a6aac4914f33473246d68ce37c3e3f171ee6dc660275a23d6a0fbfae6215" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.764371 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-jshbs"] Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.775532 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-jshbs"] Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.903867 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-d8hkh"] Nov 29 02:28:14 crc kubenswrapper[4749]: E1129 02:28:14.904476 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d192d81-2dfb-442c-98e2-ea29311de955" containerName="storage" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.904509 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d192d81-2dfb-442c-98e2-ea29311de955" containerName="storage" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.904801 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d192d81-2dfb-442c-98e2-ea29311de955" containerName="storage" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.905568 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.907832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-cnnk7" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.910711 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.911037 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.913180 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 29 02:28:14 crc kubenswrapper[4749]: I1129 02:28:14.916870 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d8hkh"] Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.058345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.058457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.058609 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tqv\" (UniqueName: \"kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.091920 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d192d81-2dfb-442c-98e2-ea29311de955" path="/var/lib/kubelet/pods/6d192d81-2dfb-442c-98e2-ea29311de955/volumes" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.160085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.160564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.160829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tqv\" (UniqueName: \"kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.160638 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.162095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.193534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tqv\" (UniqueName: \"kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv\") pod \"crc-storage-crc-d8hkh\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.241301 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:15 crc kubenswrapper[4749]: I1129 02:28:15.764910 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d8hkh"] Nov 29 02:28:15 crc kubenswrapper[4749]: W1129 02:28:15.778718 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03c5c027_177c_4f13_a028_f2cac8ab223e.slice/crio-3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77 WatchSource:0}: Error finding container 3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77: Status 404 returned error can't find the container with id 3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77 Nov 29 02:28:16 crc kubenswrapper[4749]: I1129 02:28:16.338187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d8hkh" event={"ID":"03c5c027-177c-4f13-a028-f2cac8ab223e","Type":"ContainerStarted","Data":"3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77"} Nov 29 02:28:17 crc kubenswrapper[4749]: I1129 02:28:17.352559 4749 generic.go:334] "Generic (PLEG): container finished" podID="03c5c027-177c-4f13-a028-f2cac8ab223e" containerID="635d759f2288ed0901ea772685554affb777db5cf35ea89a97cdda7355b323ef" exitCode=0 Nov 29 02:28:17 crc kubenswrapper[4749]: I1129 02:28:17.352642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d8hkh" event={"ID":"03c5c027-177c-4f13-a028-f2cac8ab223e","Type":"ContainerDied","Data":"635d759f2288ed0901ea772685554affb777db5cf35ea89a97cdda7355b323ef"} Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.750492 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.825825 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt\") pod \"03c5c027-177c-4f13-a028-f2cac8ab223e\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.825925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2tqv\" (UniqueName: \"kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv\") pod \"03c5c027-177c-4f13-a028-f2cac8ab223e\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.826393 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage\") pod \"03c5c027-177c-4f13-a028-f2cac8ab223e\" (UID: \"03c5c027-177c-4f13-a028-f2cac8ab223e\") " Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.825929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "03c5c027-177c-4f13-a028-f2cac8ab223e" (UID: "03c5c027-177c-4f13-a028-f2cac8ab223e"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.841340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv" (OuterVolumeSpecName: "kube-api-access-n2tqv") pod "03c5c027-177c-4f13-a028-f2cac8ab223e" (UID: "03c5c027-177c-4f13-a028-f2cac8ab223e"). InnerVolumeSpecName "kube-api-access-n2tqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.853993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "03c5c027-177c-4f13-a028-f2cac8ab223e" (UID: "03c5c027-177c-4f13-a028-f2cac8ab223e"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.930000 4749 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/03c5c027-177c-4f13-a028-f2cac8ab223e-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.930101 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2tqv\" (UniqueName: \"kubernetes.io/projected/03c5c027-177c-4f13-a028-f2cac8ab223e-kube-api-access-n2tqv\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:18 crc kubenswrapper[4749]: I1129 02:28:18.930125 4749 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/03c5c027-177c-4f13-a028-f2cac8ab223e-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 29 02:28:19 crc kubenswrapper[4749]: I1129 02:28:19.370670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d8hkh" event={"ID":"03c5c027-177c-4f13-a028-f2cac8ab223e","Type":"ContainerDied","Data":"3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77"} Nov 29 02:28:19 crc kubenswrapper[4749]: I1129 02:28:19.370732 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e33cd900399ac12204ed9a56820502339f4becd845e445202a1eabe2724ff77" Nov 29 02:28:19 crc kubenswrapper[4749]: I1129 02:28:19.370837 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d8hkh" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.440891 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:42 crc kubenswrapper[4749]: E1129 02:29:42.441904 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03c5c027-177c-4f13-a028-f2cac8ab223e" containerName="storage" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.441926 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="03c5c027-177c-4f13-a028-f2cac8ab223e" containerName="storage" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.442264 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="03c5c027-177c-4f13-a028-f2cac8ab223e" containerName="storage" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.444989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.462996 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.484583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.484671 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58hgq\" (UniqueName: \"kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.484732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.586444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.586838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.586905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58hgq\" (UniqueName: \"kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.587097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.587803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.612347 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-58hgq\" (UniqueName: \"kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq\") pod \"community-operators-5kttc\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:42 crc kubenswrapper[4749]: I1129 02:29:42.810906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:43 crc kubenswrapper[4749]: I1129 02:29:43.294925 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:44 crc kubenswrapper[4749]: I1129 02:29:44.215639 4749 generic.go:334] "Generic (PLEG): container finished" podID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerID="f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6" exitCode=0 Nov 29 02:29:44 crc kubenswrapper[4749]: I1129 02:29:44.215725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerDied","Data":"f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6"} Nov 29 02:29:44 crc kubenswrapper[4749]: I1129 02:29:44.215985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerStarted","Data":"80505b47a4d90b38996ca9471720788a08e9577d97d237e8e499ed7c67eab681"} Nov 29 02:29:44 crc kubenswrapper[4749]: I1129 02:29:44.218834 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:29:46 crc kubenswrapper[4749]: I1129 02:29:46.236190 4749 generic.go:334] "Generic (PLEG): container finished" podID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerID="b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464" exitCode=0 Nov 29 02:29:46 crc kubenswrapper[4749]: I1129 02:29:46.236386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerDied","Data":"b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464"} Nov 29 02:29:47 crc kubenswrapper[4749]: I1129 02:29:47.252267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerStarted","Data":"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55"} Nov 29 02:29:47 crc kubenswrapper[4749]: I1129 02:29:47.284193 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5kttc" podStartSLOduration=2.578988077 podStartE2EDuration="5.284166891s" podCreationTimestamp="2025-11-29 02:29:42 +0000 UTC" firstStartedPulling="2025-11-29 02:29:44.218557249 +0000 UTC m=+4727.390707106" lastFinishedPulling="2025-11-29 02:29:46.923736053 +0000 UTC m=+4730.095885920" observedRunningTime="2025-11-29 02:29:47.273930304 +0000 UTC m=+4730.446080191" watchObservedRunningTime="2025-11-29 02:29:47.284166891 +0000 UTC m=+4730.456316788" Nov 29 02:29:52 crc kubenswrapper[4749]: I1129 02:29:52.812484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:52 crc kubenswrapper[4749]: I1129 02:29:52.813544 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:52 crc kubenswrapper[4749]: I1129 02:29:52.885747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:53 crc kubenswrapper[4749]: I1129 02:29:53.378890 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:53 crc kubenswrapper[4749]: I1129 02:29:53.445578 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.324298 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5kttc" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="registry-server" containerID="cri-o://ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55" gracePeriod=2 Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.774371 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.908015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58hgq\" (UniqueName: \"kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq\") pod \"db46aef9-9e07-4a7a-acda-f28f78e245d8\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.908151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content\") pod \"db46aef9-9e07-4a7a-acda-f28f78e245d8\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.908266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities\") pod \"db46aef9-9e07-4a7a-acda-f28f78e245d8\" (UID: \"db46aef9-9e07-4a7a-acda-f28f78e245d8\") " Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.911564 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities" (OuterVolumeSpecName: "utilities") pod "db46aef9-9e07-4a7a-acda-f28f78e245d8" (UID: "db46aef9-9e07-4a7a-acda-f28f78e245d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:29:55 crc kubenswrapper[4749]: I1129 02:29:55.917668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq" (OuterVolumeSpecName: "kube-api-access-58hgq") pod "db46aef9-9e07-4a7a-acda-f28f78e245d8" (UID: "db46aef9-9e07-4a7a-acda-f28f78e245d8"). InnerVolumeSpecName "kube-api-access-58hgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.010617 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.010668 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58hgq\" (UniqueName: \"kubernetes.io/projected/db46aef9-9e07-4a7a-acda-f28f78e245d8-kube-api-access-58hgq\") on node \"crc\" DevicePath \"\"" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.205834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db46aef9-9e07-4a7a-acda-f28f78e245d8" (UID: "db46aef9-9e07-4a7a-acda-f28f78e245d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.213991 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db46aef9-9e07-4a7a-acda-f28f78e245d8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.354724 4749 generic.go:334] "Generic (PLEG): container finished" podID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerID="ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55" exitCode=0 Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.354785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerDied","Data":"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55"} Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.354824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kttc" event={"ID":"db46aef9-9e07-4a7a-acda-f28f78e245d8","Type":"ContainerDied","Data":"80505b47a4d90b38996ca9471720788a08e9577d97d237e8e499ed7c67eab681"} Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.354856 4749 scope.go:117] "RemoveContainer" containerID="ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.355041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kttc" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.403425 4749 scope.go:117] "RemoveContainer" containerID="b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.414571 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.424430 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5kttc"] Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.428978 4749 scope.go:117] "RemoveContainer" containerID="f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.462158 4749 scope.go:117] "RemoveContainer" containerID="ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55" Nov 29 02:29:56 crc kubenswrapper[4749]: E1129 02:29:56.462853 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55\": container with ID starting with ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55 not found: ID does not exist" containerID="ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.462903 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55"} err="failed to get container status \"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55\": rpc error: code = NotFound desc = could not find container \"ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55\": container with ID starting with ab1337763c63930629fc3512a906fa553045254409b88cbb5f9d728beae2bd55 not found: ID does not exist" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.462935 4749 scope.go:117] "RemoveContainer" containerID="b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464" Nov 29 02:29:56 crc kubenswrapper[4749]: E1129 02:29:56.463784 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464\": container with ID starting with b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464 not found: ID does not exist" containerID="b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.463832 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464"} err="failed to get container status \"b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464\": rpc error: code = NotFound desc = could not find container \"b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464\": container with ID starting with b6f774d5d810d229e46db16f3a6d5fcaea67a6364d327f0180ce2b1d9d283464 not found: ID does not exist" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.463856 4749 scope.go:117] "RemoveContainer" containerID="f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6" Nov 29 02:29:56 crc kubenswrapper[4749]: E1129 02:29:56.464355 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6\": container with ID starting with f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6 not found: ID does not exist" containerID="f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6" Nov 29 02:29:56 crc kubenswrapper[4749]: I1129 02:29:56.464396 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6"} err="failed to get container status \"f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6\": rpc error: code = NotFound desc = could not find container \"f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6\": container with ID starting with f433a4eb2a22d5b8c2e9c2e5007e5a0e58dfbc746d880b00b4e307a0a21fd9b6 not found: ID does not exist" Nov 29 02:29:57 crc kubenswrapper[4749]: I1129 02:29:57.086815 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" path="/var/lib/kubelet/pods/db46aef9-9e07-4a7a-acda-f28f78e245d8/volumes" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.173799 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh"] Nov 29 02:30:00 crc kubenswrapper[4749]: E1129 02:30:00.174613 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="extract-utilities" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.174636 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="extract-utilities" Nov 29 02:30:00 crc kubenswrapper[4749]: E1129 02:30:00.174670 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="extract-content" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.174681 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="extract-content" Nov 29 02:30:00 crc kubenswrapper[4749]: E1129 02:30:00.174717 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="registry-server" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.174729 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="registry-server" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.174966 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="db46aef9-9e07-4a7a-acda-f28f78e245d8" containerName="registry-server" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.175676 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.186080 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh"] Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.206631 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.206676 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.309035 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdhcw\" (UniqueName: \"kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.309087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.309262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.410601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.410752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.410872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdhcw\" (UniqueName: \"kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.413463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume\") pod 
\"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.600450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.602586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdhcw\" (UniqueName: \"kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw\") pod \"collect-profiles-29406390-mvqkh\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:00 crc kubenswrapper[4749]: I1129 02:30:00.834575 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:01 crc kubenswrapper[4749]: I1129 02:30:01.090942 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh"] Nov 29 02:30:01 crc kubenswrapper[4749]: I1129 02:30:01.412780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" event={"ID":"f1cf6c0f-36d8-4b64-89ae-55d85218f65a","Type":"ContainerStarted","Data":"b2d0afbb1d7a1a3ffbaf6b6a7bb0c632e68165c992edc50dafd45ec5213467ee"} Nov 29 02:30:01 crc kubenswrapper[4749]: I1129 02:30:01.412845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" event={"ID":"f1cf6c0f-36d8-4b64-89ae-55d85218f65a","Type":"ContainerStarted","Data":"c08a57b75be21265ee1f4ed39e29eb82a871aa81e2b9c8cca95cf80c078aac0c"} Nov 29 02:30:01 crc kubenswrapper[4749]: I1129 02:30:01.431271 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" podStartSLOduration=1.431245765 podStartE2EDuration="1.431245765s" podCreationTimestamp="2025-11-29 02:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:30:01.427405683 +0000 UTC m=+4744.599555610" watchObservedRunningTime="2025-11-29 02:30:01.431245765 +0000 UTC m=+4744.603395652" Nov 29 02:30:02 crc kubenswrapper[4749]: I1129 02:30:02.426889 4749 generic.go:334] "Generic (PLEG): container finished" podID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" containerID="b2d0afbb1d7a1a3ffbaf6b6a7bb0c632e68165c992edc50dafd45ec5213467ee" exitCode=0 Nov 29 02:30:02 crc kubenswrapper[4749]: I1129 02:30:02.427029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" event={"ID":"f1cf6c0f-36d8-4b64-89ae-55d85218f65a","Type":"ContainerDied","Data":"b2d0afbb1d7a1a3ffbaf6b6a7bb0c632e68165c992edc50dafd45ec5213467ee"} Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.756772 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.861695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdhcw\" (UniqueName: \"kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw\") pod \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.861836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume\") pod \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.861879 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume\") pod \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\" (UID: \"f1cf6c0f-36d8-4b64-89ae-55d85218f65a\") " Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.862598 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume" (OuterVolumeSpecName: "config-volume") pod "f1cf6c0f-36d8-4b64-89ae-55d85218f65a" (UID: "f1cf6c0f-36d8-4b64-89ae-55d85218f65a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.866938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f1cf6c0f-36d8-4b64-89ae-55d85218f65a" (UID: "f1cf6c0f-36d8-4b64-89ae-55d85218f65a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.866960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw" (OuterVolumeSpecName: "kube-api-access-jdhcw") pod "f1cf6c0f-36d8-4b64-89ae-55d85218f65a" (UID: "f1cf6c0f-36d8-4b64-89ae-55d85218f65a"). InnerVolumeSpecName "kube-api-access-jdhcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.963878 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.963919 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:30:03 crc kubenswrapper[4749]: I1129 02:30:03.963933 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdhcw\" (UniqueName: \"kubernetes.io/projected/f1cf6c0f-36d8-4b64-89ae-55d85218f65a-kube-api-access-jdhcw\") on node \"crc\" DevicePath \"\"" Nov 29 02:30:04 crc kubenswrapper[4749]: I1129 02:30:04.452427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" event={"ID":"f1cf6c0f-36d8-4b64-89ae-55d85218f65a","Type":"ContainerDied","Data":"c08a57b75be21265ee1f4ed39e29eb82a871aa81e2b9c8cca95cf80c078aac0c"} Nov 29 02:30:04 crc kubenswrapper[4749]: I1129 02:30:04.452711 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c08a57b75be21265ee1f4ed39e29eb82a871aa81e2b9c8cca95cf80c078aac0c" Nov 29 02:30:04 crc kubenswrapper[4749]: I1129 02:30:04.452489 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh" Nov 29 02:30:04 crc kubenswrapper[4749]: I1129 02:30:04.506132 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9"] Nov 29 02:30:04 crc kubenswrapper[4749]: I1129 02:30:04.512339 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406345-kf6j9"] Nov 29 02:30:05 crc kubenswrapper[4749]: I1129 02:30:05.091388 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4960404c-fe87-49b2-8574-c6f268509899" path="/var/lib/kubelet/pods/4960404c-fe87-49b2-8574-c6f268509899/volumes" Nov 29 02:30:13 crc kubenswrapper[4749]: I1129 02:30:13.350969 4749 scope.go:117] "RemoveContainer" containerID="77a978578c0c8b13e73bbf481b17d6950413a06b2fa1b645983728425088d52d" Nov 29 02:30:25 crc kubenswrapper[4749]: I1129 02:30:25.375061 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:30:25 crc kubenswrapper[4749]: I1129 02:30:25.375924 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:30:55 crc kubenswrapper[4749]: I1129 02:30:55.374006 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Nov 29 02:30:55 crc kubenswrapper[4749]: I1129 02:30:55.374772 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:31:25 crc kubenswrapper[4749]: I1129 02:31:25.374538 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:31:25 crc kubenswrapper[4749]: I1129 02:31:25.375060 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:31:25 crc kubenswrapper[4749]: I1129 02:31:25.375114 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
Nov 29 02:31:25 crc kubenswrapper[4749]: I1129 02:31:25.375862 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 02:31:25 crc kubenswrapper[4749]: I1129 02:31:25.375930 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682" gracePeriod=600
Nov 29 02:31:26 crc kubenswrapper[4749]: I1129 02:31:26.432336 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682" exitCode=0
Nov 29 02:31:26 crc kubenswrapper[4749]: I1129 02:31:26.432426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682"}
Nov 29 02:31:26 crc kubenswrapper[4749]: I1129 02:31:26.432942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24"}
Nov 29 02:31:26 crc kubenswrapper[4749]: I1129 02:31:26.432962 4749 scope.go:117] "RemoveContainer" containerID="a15b49edb31cdd95dba8b7f85372b6108f2242e6729a533befb6299c006a449c"
Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.992966 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"]
Nov 29 02:31:45 crc kubenswrapper[4749]: E1129 02:31:45.994939 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" containerName="collect-profiles"
"RemoveStaleState: removing container" podUID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" containerName="collect-profiles" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.995021 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" containerName="collect-profiles" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.995239 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" containerName="collect-profiles" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.995986 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.998298 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.999490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 29 02:31:45 crc kubenswrapper[4749]: I1129 02:31:45.999585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cmm4w" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.007594 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.008250 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"] Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.010751 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.082899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7nk\" (UniqueName: \"kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.083326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.083364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.185536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7nk\" (UniqueName: \"kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.185960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc\") pod 
\"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.186557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.186972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.187109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.239036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7nk\" (UniqueName: \"kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk\") pod \"dnsmasq-dns-5d7b5456f5-n82s7\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.279526 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"] Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.280764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.287997 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"] Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.310059 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.388318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgwc2\" (UniqueName: \"kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.388400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.388518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.490330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.490638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgwc2\" (UniqueName: \"kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.490703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.491164 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.491529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.509323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgwc2\" (UniqueName: \"kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2\") pod \"dnsmasq-dns-98ddfc8f-jj8gj\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:46 crc 
Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.598251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj"
Nov 29 02:31:46 crc kubenswrapper[4749]: I1129 02:31:46.771101 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"]
Nov 29 02:31:46 crc kubenswrapper[4749]: W1129 02:31:46.779463 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18dad77e_04e7_4b89_a24f_db71c3ac78ec.slice/crio-9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746 WatchSource:0}: Error finding container 9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746: Status 404 returned error can't find the container with id 9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.025286 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"]
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.154485 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.156158 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.158763 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.158995 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.159025 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fdkdr"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.160302 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.160558 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.180380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307924 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.307998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6fnz\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.308031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.308074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.408369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.408942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.408985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0"
pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6fnz\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409070 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409169 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.409931 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.410231 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.410321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.410608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.410982 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.412899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.413124 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kxh5r" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.413300 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.413638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.413769 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.415662 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.415700 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fb30d21bff812042c3ba73d535733d4a541c8291c9170e06c00ec0a2b33d56dc/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.437164 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.502670 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.509703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.510522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.510565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q52v\" (UniqueName: 
\"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.510593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.510874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.511045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.511163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.511489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.511570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.511602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.512000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.521917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t6fnz\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.531285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " pod="openstack/rabbitmq-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q52v\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615297 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.615368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.619901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.620969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.621907 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.622009 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c51c2b13dbf0b9756210c6f61ae1019431eed3ed4124bcd7795448ba4b882c80/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.622094 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.622852 4749 generic.go:334] "Generic (PLEG): container finished" podID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerID="4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691" exitCode=0 Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.622906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" event={"ID":"18dad77e-04e7-4b89-a24f-db71c3ac78ec","Type":"ContainerDied","Data":"4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691"} Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.622929 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" event={"ID":"18dad77e-04e7-4b89-a24f-db71c3ac78ec","Type":"ContainerStarted","Data":"9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746"} Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.623128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.624081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" event={"ID":"2915b260-1e9b-4711-bbcb-0f04e4385aa0","Type":"ContainerStarted","Data":"73b20c646271be95ec0b35047c5f80ba9a1553c5e96fd4a795350f1b564f5b20"} Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.625165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.628708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.635538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.640112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q52v\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.657563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.702978 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:31:47 crc kubenswrapper[4749]: I1129 02:31:47.791745 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 02:31:47 crc kubenswrapper[4749]: E1129 02:31:47.919333 4749 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Nov 29 02:31:47 crc kubenswrapper[4749]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/18dad77e-04e7-4b89-a24f-db71c3ac78ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 29 02:31:47 crc kubenswrapper[4749]: > podSandboxID="9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746"
Nov 29 02:31:47 crc kubenswrapper[4749]: E1129 02:31:47.919762 4749 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Nov 29 02:31:47 crc kubenswrapper[4749]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8v7nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-n82s7_openstack(18dad77e-04e7-4b89-a24f-db71c3ac78ec): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/18dad77e-04e7-4b89-a24f-db71c3ac78ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Nov 29 02:31:47 crc kubenswrapper[4749]: > logger="UnhandledError"
Nov 29 02:31:47 crc kubenswrapper[4749]: E1129 02:31:47.922318 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/18dad77e-04e7-4b89-a24f-db71c3ac78ec/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec"
Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.225940 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 02:31:48 crc kubenswrapper[4749]: W1129 02:31:48.235703 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75b1e0f_e99a_4afd_a3f6_f1454f09ddad.slice/crio-a876528b251e2d9b3a2faad72da949a5f072a20c358bb010eb89524881babd06 WatchSource:0}: Error finding container a876528b251e2d9b3a2faad72da949a5f072a20c358bb010eb89524881babd06: Status 404 returned error can't find the container with id a876528b251e2d9b3a2faad72da949a5f072a20c358bb010eb89524881babd06
Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.332592 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:31:48 crc kubenswrapper[4749]: W1129 02:31:48.337148 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6695b886_1903_4ce0_a4d6_c0b230522074.slice/crio-cc57cded592f47d6181f4311e12b016b58555d6715ff4032d2a57de9aa6ca76c WatchSource:0}: Error finding container cc57cded592f47d6181f4311e12b016b58555d6715ff4032d2a57de9aa6ca76c: Status 404 returned error can't find the container with id cc57cded592f47d6181f4311e12b016b58555d6715ff4032d2a57de9aa6ca76c
Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.630607 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.632860 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.637256 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.637302 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.637911 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.638106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8jb5m" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.639046 4749 generic.go:334] "Generic (PLEG): container finished" podID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerID="4640918e3c76b95a1e9077c59fbd2cdba5959bf9a591fc29dbeb9a9772c47abd" exitCode=0 Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.639156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" event={"ID":"2915b260-1e9b-4711-bbcb-0f04e4385aa0","Type":"ContainerDied","Data":"4640918e3c76b95a1e9077c59fbd2cdba5959bf9a591fc29dbeb9a9772c47abd"} Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.643998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerStarted","Data":"a876528b251e2d9b3a2faad72da949a5f072a20c358bb010eb89524881babd06"} Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.646280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerStarted","Data":"cc57cded592f47d6181f4311e12b016b58555d6715ff4032d2a57de9aa6ca76c"} Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.652975 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.655374 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.740888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz7z\" (UniqueName: \"kubernetes.io/projected/0aadf221-92e1-4c07-8f50-4a5e503b8870-kube-api-access-hcz7z\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741396 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741429 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.741624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz7z\" (UniqueName: \"kubernetes.io/projected/0aadf221-92e1-4c07-8f50-4a5e503b8870-kube-api-access-hcz7z\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843556 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.843670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.844370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.844663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-kolla-config\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.845524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-config-data-default\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.845838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aadf221-92e1-4c07-8f50-4a5e503b8870-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.846035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.869178 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
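The records above trace the kubelet's volume pipeline in the same order for each pod: operationExecutor.VerifyControllerAttachedVolume per volume, then MountVolume.MountDevice (the staging step is skipped here because the kubevirt.io.hostpath-provisioner CSI driver does not advertise the STAGE_UNSTAGE_VOLUME capability), then MountVolume.SetUp per volume. Auditing a sequence like this is easier with the journal filtered to a single pod; the sketch below is an editor's illustration in Go (not kubelet code) that pulls the klog severity, timestamp, source location, and trailing pod="…" field out of records shaped like the ones in this log. The regexp and the hard-coded target pod are assumptions made for the example.

```go
// log_scan.go — minimal sketch: trace one pod's kubelet records from stdin.
// Assumes klog-style records ("I1129 02:31:47.415700 4749 file.go:123] msg")
// embedded in journal lines, as seen in this log.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// severity, MMDD hh:mm:ss.micros, source file:line, rest of the record
	record   = regexp.MustCompile(`([IWE])(\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+([\w./-]+:\d+)\] (.*)`)
	podField = regexp.MustCompile(`pod="([^"]+)"`)
)

func main() {
	target := "openstack/rabbitmq-server-0" // hypothetical target; any pod="…" value works
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 4*1024*1024) // journal lines here can be very long
	for sc.Scan() {
		m := record.FindStringSubmatch(sc.Text())
		if m == nil {
			continue // continuation lines (err=< ... >) carry no klog header
		}
		p := podField.FindStringSubmatch(m[4])
		if p == nil || p[1] != target {
			continue
		}
		msg := m[4]
		if len(msg) > 100 {
			msg = msg[:100] + "..."
		}
		fmt.Printf("%s %s %-30s %s\n", m[1], m[2], m[3], msg)
	}
}
```

Fed this section of the journal on stdin (go run log_scan.go < kubelet.log), it would reduce rabbitmq-server-0 to its sandbox, volume, probe, and PLEG records, which is usually enough to see where a pod's startup stalls.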
Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.869463 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9ee163215e16d32e92b0e7f0cea256036c8f8b86c6cc2054ebbfa29c671ef5f3/globalmount\"" pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.903033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.903907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz7z\" (UniqueName: \"kubernetes.io/projected/0aadf221-92e1-4c07-8f50-4a5e503b8870-kube-api-access-hcz7z\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:48 crc kubenswrapper[4749]: I1129 02:31:48.904554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0aadf221-92e1-4c07-8f50-4a5e503b8870-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.012316 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.013434 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.016519 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.022059 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ltvcc" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.030460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb1bc21-a601-4c1a-bcc9-38e1be58c799\") pod \"openstack-galera-0\" (UID: \"0aadf221-92e1-4c07-8f50-4a5e503b8870\") " pod="openstack/openstack-galera-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.032869 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.052464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6g2r\" (UniqueName: \"kubernetes.io/projected/0564ef85-5416-4525-8e67-55cd54992646-kube-api-access-h6g2r\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.052532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-config-data\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.052600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-kolla-config\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.154243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6g2r\" (UniqueName: \"kubernetes.io/projected/0564ef85-5416-4525-8e67-55cd54992646-kube-api-access-h6g2r\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.154319 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-config-data\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.154379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-kolla-config\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.155335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-config-data\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.155510 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0564ef85-5416-4525-8e67-55cd54992646-kolla-config\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.175452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6g2r\" (UniqueName: \"kubernetes.io/projected/0564ef85-5416-4525-8e67-55cd54992646-kube-api-access-h6g2r\") pod \"memcached-0\" (UID: \"0564ef85-5416-4525-8e67-55cd54992646\") " pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.258076 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.361645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 29 02:31:49 crc kubenswrapper[4749]: I1129 02:31:49.709842 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.037397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.194444 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.196512 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.202278 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.202462 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.202588 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rz5k2" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.202779 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.256695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293744 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8451623f-6287-4dea-8ded-80f91012a5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8451623f-6287-4dea-8ded-80f91012a5c0\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48h59\" (UniqueName: \"kubernetes.io/projected/203e2625-2d50-46c8-b781-5f7fd1304777-kube-api-access-48h59\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.293885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395483 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395533 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8451623f-6287-4dea-8ded-80f91012a5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8451623f-6287-4dea-8ded-80f91012a5c0\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48h59\" (UniqueName: \"kubernetes.io/projected/203e2625-2d50-46c8-b781-5f7fd1304777-kube-api-access-48h59\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.395660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.396690 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.396806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.397208 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/203e2625-2d50-46c8-b781-5f7fd1304777-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.398160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/203e2625-2d50-46c8-b781-5f7fd1304777-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.400053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.404261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203e2625-2d50-46c8-b781-5f7fd1304777-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.415597 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.415632 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8451623f-6287-4dea-8ded-80f91012a5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8451623f-6287-4dea-8ded-80f91012a5c0\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68addeb2b12fdbb659a0cecd7eb1c96a575651289c80f1b7282fb32984c5d322/globalmount\"" pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.423473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48h59\" (UniqueName: \"kubernetes.io/projected/203e2625-2d50-46c8-b781-5f7fd1304777-kube-api-access-48h59\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.437363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8451623f-6287-4dea-8ded-80f91012a5c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8451623f-6287-4dea-8ded-80f91012a5c0\") pod \"openstack-cell1-galera-0\" (UID: \"203e2625-2d50-46c8-b781-5f7fd1304777\") " pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.566864 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.667353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" event={"ID":"18dad77e-04e7-4b89-a24f-db71c3ac78ec","Type":"ContainerStarted","Data":"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.668062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.680499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerStarted","Data":"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.682673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0564ef85-5416-4525-8e67-55cd54992646","Type":"ContainerStarted","Data":"e5ae36bea9804ca804d9b99ffa3c3218842b2000a63cbec9982d0af89843ccd9"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.682714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0564ef85-5416-4525-8e67-55cd54992646","Type":"ContainerStarted","Data":"9d3d4cc8549b152d6369f192bb026ad25a172744a255e207195c357090d8347e"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.683975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.692595 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" event={"ID":"2915b260-1e9b-4711-bbcb-0f04e4385aa0","Type":"ContainerStarted","Data":"399cd8848f9fb0f807537512005e6486714160725cedccc715bf8db9def2a1ed"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.693374 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.696570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0aadf221-92e1-4c07-8f50-4a5e503b8870","Type":"ContainerStarted","Data":"c0293ae1ff8c150579a73ae03a063676ddec94bbad577c50b350f0df5337632c"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.696725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0aadf221-92e1-4c07-8f50-4a5e503b8870","Type":"ContainerStarted","Data":"b71691e7a76641fc7359e31173858c68ff8c567f462a95aea293c59e7004e04a"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.698464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerStarted","Data":"cf40cd595d06f4a927f7d796b6fa8a44283515e4b110ceb3b0d7409178238c7d"} Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.710866 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" podStartSLOduration=5.710850004 podStartE2EDuration="5.710850004s" podCreationTimestamp="2025-11-29 02:31:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:31:50.692889311 +0000 UTC m=+4853.865039208" watchObservedRunningTime="2025-11-29 
Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.720068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" podStartSLOduration=4.720055056 podStartE2EDuration="4.720055056s" podCreationTimestamp="2025-11-29 02:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:31:50.714239516 +0000 UTC m=+4853.886389413" watchObservedRunningTime="2025-11-29 02:31:50.720055056 +0000 UTC m=+4853.892204923"
Nov 29 02:31:50 crc kubenswrapper[4749]: I1129 02:31:50.758413 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.758320728 podStartE2EDuration="2.758320728s" podCreationTimestamp="2025-11-29 02:31:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:31:50.753350568 +0000 UTC m=+4853.925500435" watchObservedRunningTime="2025-11-29 02:31:50.758320728 +0000 UTC m=+4853.930470585"
Nov 29 02:31:51 crc kubenswrapper[4749]: I1129 02:31:51.065120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Nov 29 02:31:51 crc kubenswrapper[4749]: I1129 02:31:51.709871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"203e2625-2d50-46c8-b781-5f7fd1304777","Type":"ContainerStarted","Data":"193d7be2fce0eba8b9e86cbfb8cc892fd7fc3270a6cb7ffc36ffde5c07f0a891"}
Nov 29 02:31:51 crc kubenswrapper[4749]: I1129 02:31:51.710300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"203e2625-2d50-46c8-b781-5f7fd1304777","Type":"ContainerStarted","Data":"f44c3ea0feb8e25553986dbf7f0d0194b0264802605bf53d4dffb3841a95f522"}
Nov 29 02:31:53 crc kubenswrapper[4749]: I1129 02:31:53.731416 4749 generic.go:334] "Generic (PLEG): container finished" podID="0aadf221-92e1-4c07-8f50-4a5e503b8870" containerID="c0293ae1ff8c150579a73ae03a063676ddec94bbad577c50b350f0df5337632c" exitCode=0
Nov 29 02:31:53 crc kubenswrapper[4749]: I1129 02:31:53.731472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0aadf221-92e1-4c07-8f50-4a5e503b8870","Type":"ContainerDied","Data":"c0293ae1ff8c150579a73ae03a063676ddec94bbad577c50b350f0df5337632c"}
Nov 29 02:31:54 crc kubenswrapper[4749]: I1129 02:31:54.744657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0aadf221-92e1-4c07-8f50-4a5e503b8870","Type":"ContainerStarted","Data":"96f65628fa59008e5d08dfe9b1ce0a0677d79d1f384650e31c2802363b02fb3b"}
Nov 29 02:31:54 crc kubenswrapper[4749]: I1129 02:31:54.777173 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.777153016 podStartE2EDuration="7.777153016s" podCreationTimestamp="2025-11-29 02:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:31:54.770470365 +0000 UTC m=+4857.942620222" watchObservedRunningTime="2025-11-29 02:31:54.777153016 +0000 UTC m=+4857.949302873"
Nov 29 02:31:55 crc kubenswrapper[4749]: I1129 02:31:55.758515 4749 generic.go:334] "Generic (PLEG): container finished" podID="203e2625-2d50-46c8-b781-5f7fd1304777" containerID="193d7be2fce0eba8b9e86cbfb8cc892fd7fc3270a6cb7ffc36ffde5c07f0a891" exitCode=0
Nov 29 02:31:55 crc kubenswrapper[4749]: I1129 02:31:55.758634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"203e2625-2d50-46c8-b781-5f7fd1304777","Type":"ContainerDied","Data":"193d7be2fce0eba8b9e86cbfb8cc892fd7fc3270a6cb7ffc36ffde5c07f0a891"}
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.312763 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7"
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.600357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj"
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.653867 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"]
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.769472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"203e2625-2d50-46c8-b781-5f7fd1304777","Type":"ContainerStarted","Data":"da6fdb34e5808d292c156aed91878217c316edfcd7f3a90a5f6c16e67909d317"}
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.769740 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerName="dnsmasq-dns" containerID="cri-o://5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364" gracePeriod=10
Nov 29 02:31:56 crc kubenswrapper[4749]: I1129 02:31:56.803291 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.803272743 podStartE2EDuration="7.803272743s" podCreationTimestamp="2025-11-29 02:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:31:56.795439945 +0000 UTC m=+4859.967589812" watchObservedRunningTime="2025-11-29 02:31:56.803272743 +0000 UTC m=+4859.975422600"
Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.761889 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7"
Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.793561 4749 generic.go:334] "Generic (PLEG): container finished" podID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerID="5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364" exitCode=0
Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.793613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" event={"ID":"18dad77e-04e7-4b89-a24f-db71c3ac78ec","Type":"ContainerDied","Data":"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364"}
Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.793633 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7"
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.793648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-n82s7" event={"ID":"18dad77e-04e7-4b89-a24f-db71c3ac78ec","Type":"ContainerDied","Data":"9115aa6d178fe6e6a742a0e3f29599598aa1f09a18dd67c235292d5a7e7cb746"} Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.793668 4749 scope.go:117] "RemoveContainer" containerID="5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.822252 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7nk\" (UniqueName: \"kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk\") pod \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.822362 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config\") pod \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.822535 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc\") pod \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\" (UID: \"18dad77e-04e7-4b89-a24f-db71c3ac78ec\") " Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.828474 4749 scope.go:117] "RemoveContainer" containerID="4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.830903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk" (OuterVolumeSpecName: "kube-api-access-8v7nk") pod "18dad77e-04e7-4b89-a24f-db71c3ac78ec" (UID: "18dad77e-04e7-4b89-a24f-db71c3ac78ec"). InnerVolumeSpecName "kube-api-access-8v7nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.871591 4749 scope.go:117] "RemoveContainer" containerID="5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364" Nov 29 02:31:57 crc kubenswrapper[4749]: E1129 02:31:57.872174 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364\": container with ID starting with 5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364 not found: ID does not exist" containerID="5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.872368 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364"} err="failed to get container status \"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364\": rpc error: code = NotFound desc = could not find container \"5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364\": container with ID starting with 5416d7fae3add8fde66e8cca11defa6f018758f316aeebd2b5a97e2e0e82f364 not found: ID does not exist" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.872423 4749 scope.go:117] "RemoveContainer" containerID="4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691" Nov 29 02:31:57 crc kubenswrapper[4749]: E1129 02:31:57.872829 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691\": container with ID starting with 4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691 not found: ID does not exist" containerID="4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.872882 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691"} err="failed to get container status \"4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691\": rpc error: code = NotFound desc = could not find container \"4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691\": container with ID starting with 4e71310572e9366c9d758587eabe1d4b5ca8daa15a542d247c05b2223c4f6691 not found: ID does not exist" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.879551 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config" (OuterVolumeSpecName: "config") pod "18dad77e-04e7-4b89-a24f-db71c3ac78ec" (UID: "18dad77e-04e7-4b89-a24f-db71c3ac78ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.898635 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18dad77e-04e7-4b89-a24f-db71c3ac78ec" (UID: "18dad77e-04e7-4b89-a24f-db71c3ac78ec"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.925237 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.925311 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7nk\" (UniqueName: \"kubernetes.io/projected/18dad77e-04e7-4b89-a24f-db71c3ac78ec-kube-api-access-8v7nk\") on node \"crc\" DevicePath \"\"" Nov 29 02:31:57 crc kubenswrapper[4749]: I1129 02:31:57.925340 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18dad77e-04e7-4b89-a24f-db71c3ac78ec-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:31:58 crc kubenswrapper[4749]: I1129 02:31:58.148128 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"] Nov 29 02:31:58 crc kubenswrapper[4749]: I1129 02:31:58.160082 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-n82s7"] Nov 29 02:31:59 crc kubenswrapper[4749]: I1129 02:31:59.091583 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" path="/var/lib/kubelet/pods/18dad77e-04e7-4b89-a24f-db71c3ac78ec/volumes" Nov 29 02:31:59 crc kubenswrapper[4749]: I1129 02:31:59.258338 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 29 02:31:59 crc kubenswrapper[4749]: I1129 02:31:59.258427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 29 02:31:59 crc kubenswrapper[4749]: I1129 02:31:59.363394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 29 02:32:00 crc kubenswrapper[4749]: I1129 02:32:00.567385 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 29 02:32:00 crc kubenswrapper[4749]: I1129 02:32:00.567480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 29 02:32:02 crc kubenswrapper[4749]: I1129 02:32:02.019099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 29 02:32:02 crc kubenswrapper[4749]: I1129 02:32:02.125769 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 29 02:32:02 crc kubenswrapper[4749]: I1129 02:32:02.853042 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 29 02:32:02 crc kubenswrapper[4749]: I1129 02:32:02.955148 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 29 02:32:23 crc kubenswrapper[4749]: I1129 02:32:23.035698 4749 generic.go:334] "Generic (PLEG): container finished" podID="6695b886-1903-4ce0-a4d6-c0b230522074" containerID="916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928" exitCode=0 Nov 29 02:32:23 crc kubenswrapper[4749]: I1129 02:32:23.035835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerDied","Data":"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"} Nov 29 02:32:23 
crc kubenswrapper[4749]: I1129 02:32:23.039927 4749 generic.go:334] "Generic (PLEG): container finished" podID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerID="cf40cd595d06f4a927f7d796b6fa8a44283515e4b110ceb3b0d7409178238c7d" exitCode=0 Nov 29 02:32:23 crc kubenswrapper[4749]: I1129 02:32:23.039965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerDied","Data":"cf40cd595d06f4a927f7d796b6fa8a44283515e4b110ceb3b0d7409178238c7d"} Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.048299 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerStarted","Data":"37bd6c094aa1a10f8a5e94805a5c46a07c77866f24b4482efae1548c4cf1ed09"} Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.048683 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.050890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerStarted","Data":"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"} Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.051147 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.086872 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.086853244 podStartE2EDuration="38.086853244s" podCreationTimestamp="2025-11-29 02:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:32:24.086399103 +0000 UTC m=+4887.258549000" watchObservedRunningTime="2025-11-29 02:32:24.086853244 +0000 UTC m=+4887.259003111" Nov 29 02:32:24 crc kubenswrapper[4749]: I1129 02:32:24.105316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.105296889 podStartE2EDuration="38.105296889s" podCreationTimestamp="2025-11-29 02:31:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:32:24.103516236 +0000 UTC m=+4887.275666123" watchObservedRunningTime="2025-11-29 02:32:24.105296889 +0000 UTC m=+4887.277446766" Nov 29 02:32:37 crc kubenswrapper[4749]: I1129 02:32:37.710120 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:37 crc kubenswrapper[4749]: I1129 02:32:37.794421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.512073 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"] Nov 29 02:32:44 crc kubenswrapper[4749]: E1129 02:32:44.514456 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerName="init" Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.514574 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerName="init" Nov 29 02:32:44 crc kubenswrapper[4749]: 
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.514751 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerName="dnsmasq-dns"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.515045 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="18dad77e-04e7-4b89-a24f-db71c3ac78ec" containerName="dnsmasq-dns"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.516164 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.536906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"]
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.692605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.692977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr48c\" (UniqueName: \"kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.693039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.794963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.795426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr48c\" (UniqueName: \"kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.795477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.796413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.796428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.833783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr48c\" (UniqueName: \"kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c\") pod \"dnsmasq-dns-5b7946d7b9-88dhl\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:44 crc kubenswrapper[4749]: I1129 02:32:44.875857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl"
Nov 29 02:32:45 crc kubenswrapper[4749]: I1129 02:32:45.157532 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"]
Nov 29 02:32:45 crc kubenswrapper[4749]: W1129 02:32:45.162140 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9994b917_838f_4a83_9ef5_1d79cda66294.slice/crio-0ee0d67d7299e7fbfe9f1134a148120b9cb819c3b0e28602fac8f1c4365bdb02 WatchSource:0}: Error finding container 0ee0d67d7299e7fbfe9f1134a148120b9cb819c3b0e28602fac8f1c4365bdb02: Status 404 returned error can't find the container with id 0ee0d67d7299e7fbfe9f1134a148120b9cb819c3b0e28602fac8f1c4365bdb02
Nov 29 02:32:45 crc kubenswrapper[4749]: I1129 02:32:45.245940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerStarted","Data":"0ee0d67d7299e7fbfe9f1134a148120b9cb819c3b0e28602fac8f1c4365bdb02"}
Nov 29 02:32:45 crc kubenswrapper[4749]: I1129 02:32:45.320468 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:32:46 crc kubenswrapper[4749]: I1129 02:32:46.032192 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 29 02:32:46 crc kubenswrapper[4749]: I1129 02:32:46.253130 4749 generic.go:334] "Generic (PLEG): container finished" podID="9994b917-838f-4a83-9ef5-1d79cda66294" containerID="9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730" exitCode=0
Nov 29 02:32:46 crc kubenswrapper[4749]: I1129 02:32:46.253167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerDied","Data":"9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730"}
Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.150926 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="rabbitmq" containerID="cri-o://76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136" gracePeriod=604799
Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.264444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerStarted","Data":"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3"}
event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerStarted","Data":"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3"} Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.265644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.287718 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" podStartSLOduration=3.287699355 podStartE2EDuration="3.287699355s" podCreationTimestamp="2025-11-29 02:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:32:47.285104243 +0000 UTC m=+4910.457254130" watchObservedRunningTime="2025-11-29 02:32:47.287699355 +0000 UTC m=+4910.459849222" Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.792636 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused" Nov 29 02:32:47 crc kubenswrapper[4749]: I1129 02:32:47.980439 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="rabbitmq" containerID="cri-o://37bd6c094aa1a10f8a5e94805a5c46a07c77866f24b4482efae1548c4cf1ed09" gracePeriod=604799 Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.773018 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962485 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc 
kubenswrapper[4749]: I1129 02:32:53.962586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6fnz\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.962680 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins\") pod \"6695b886-1903-4ce0-a4d6-c0b230522074\" (UID: \"6695b886-1903-4ce0-a4d6-c0b230522074\") " Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.963109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.963180 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.964288 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.970856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "erlang-cookie-secret". 
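The single "Probe failed ... dial tcp 10.217.0.237:5672: connect: connection refused" entry a few records up is the expected reading on a broker that has just been told to shut down: a TCP readiness probe is essentially a dial against the pod IP and port, so it flips to failure the instant the AMQP listener closes. A sketch of the equivalent check (the address comes from the log; the timeout is illustrative):

```go
// Sketch: the moral equivalent of a TCP readiness probe against the
// rabbitmq AMQP port seen in the log above.
package main

import (
	"fmt"
	"net"
	"time"
)

// tcpReady reports whether a TCP connection to addr can be established
// within the timeout, which is all a tcpSocket probe ultimately tests.
func tcpReady(addr string, timeout time.Duration) bool {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return false // e.g. "connect: connection refused" during shutdown
	}
	conn.Close()
	return true
}

func main() {
	fmt.Println(tcpReady("10.217.0.237:5672", time.Second))
}
```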
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.970894 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz" (OuterVolumeSpecName: "kube-api-access-t6fnz") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "kube-api-access-t6fnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.972537 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info" (OuterVolumeSpecName: "pod-info") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.981318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47" (OuterVolumeSpecName: "persistence") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "pvc-aba1533d-9f62-45f6-af1a-c2781d240c47". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 02:32:53 crc kubenswrapper[4749]: I1129 02:32:53.992110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf" (OuterVolumeSpecName: "server-conf") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.032682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6695b886-1903-4ce0-a4d6-c0b230522074" (UID: "6695b886-1903-4ce0-a4d6-c0b230522074"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064575 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") on node \"crc\" " Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064616 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064631 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064645 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064657 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6695b886-1903-4ce0-a4d6-c0b230522074-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064668 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6695b886-1903-4ce0-a4d6-c0b230522074-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064679 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6695b886-1903-4ce0-a4d6-c0b230522074-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064690 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6fnz\" (UniqueName: \"kubernetes.io/projected/6695b886-1903-4ce0-a4d6-c0b230522074-kube-api-access-t6fnz\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.064701 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6695b886-1903-4ce0-a4d6-c0b230522074-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.085106 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.085513 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-aba1533d-9f62-45f6-af1a-c2781d240c47" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47") on node "crc"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.166898 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") on node \"crc\" DevicePath \"\""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.338021 4749 generic.go:334] "Generic (PLEG): container finished" podID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerID="37bd6c094aa1a10f8a5e94805a5c46a07c77866f24b4482efae1548c4cf1ed09" exitCode=0
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.338084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerDied","Data":"37bd6c094aa1a10f8a5e94805a5c46a07c77866f24b4482efae1548c4cf1ed09"}
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.340287 4749 generic.go:334] "Generic (PLEG): container finished" podID="6695b886-1903-4ce0-a4d6-c0b230522074" containerID="76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136" exitCode=0
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.340324 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerDied","Data":"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"}
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.340362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6695b886-1903-4ce0-a4d6-c0b230522074","Type":"ContainerDied","Data":"cc57cded592f47d6181f4311e12b016b58555d6715ff4032d2a57de9aa6ca76c"}
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.340360 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.340379 4749 scope.go:117] "RemoveContainer" containerID="76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.377484 4749 scope.go:117] "RemoveContainer" containerID="916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.383606 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.395226 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.411646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:32:54 crc kubenswrapper[4749]: E1129 02:32:54.412605 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="rabbitmq"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.412623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="rabbitmq"
Nov 29 02:32:54 crc kubenswrapper[4749]: E1129 02:32:54.412682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="setup-container"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.412691 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="setup-container"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.413098 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" containerName="rabbitmq"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.415847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.420721 4749 scope.go:117] "RemoveContainer" containerID="76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422360 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422672 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 29 02:32:54 crc kubenswrapper[4749]: E1129 02:32:54.422733 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136\": container with ID starting with 76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136 not found: ID does not exist" containerID="76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422782 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136"} err="failed to get container status \"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136\": rpc error: code = NotFound desc = could not find container \"76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136\": container with ID starting with 76ee75f4a401b4e424181c6e8b5a449708e8a32bf4ab98c1a643769b305d6136 not found: ID does not exist"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422810 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422815 4749 scope.go:117] "RemoveContainer" containerID="916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422840 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fdkdr"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.422913 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 29 02:32:54 crc kubenswrapper[4749]: E1129 02:32:54.423353 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928\": container with ID starting with 916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928 not found: ID does not exist" containerID="916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.423389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928"} err="failed to get container status \"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928\": rpc error: code = NotFound desc = could not find container \"916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928\": container with ID starting with 916047488f9c4d13f178e9622bc93f46ce61065f19a283c127d6681a82af5928 not found: ID does not exist"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.435867 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574780 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g79d\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-kube-api-access-8g79d\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b55c421-1415-42f3-a604-93a5405aa469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b55c421-1415-42f3-a604-93a5405aa469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.574922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.609779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g79d\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-kube-api-access-8g79d\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681768 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b55c421-1415-42f3-a604-93a5405aa469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.681901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b55c421-1415-42f3-a604-93a5405aa469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.684405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.685113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.685416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.685599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b55c421-1415-42f3-a604-93a5405aa469-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.686962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b55c421-1415-42f3-a604-93a5405aa469-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.688029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.689671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b55c421-1415-42f3-a604-93a5405aa469-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.714331 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
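Worth noting in the mount sequence above and just below: the pod UID changed (6695b886-... for the deleted rabbitmq-server-0, 2b55c421-... for its replacement), yet the volume is the same pvc-aba1533d-9f62-45f6-af1a-c2781d240c47 that was unmounted and detached moments earlier, now re-staged at a fresh globalmount path. The replacement pod rebinds the existing claim by name, which is how the broker's data survives the recreate. A sketch of the pod-spec side of that binding (the claim name is an assumption following the usual StatefulSet volumeClaimTemplates naming, since the log only shows the bound PV):

```go
// Sketch: a pod volume that rebinds an existing PVC by claim name, the
// mechanism by which the recreated rabbitmq-server-0 keeps its data.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// The claim name below is hypothetical; the log only shows the bound
	// PV, pvc-aba1533d-9f62-45f6-af1a-c2781d240c47.
	vol := corev1.Volume{
		Name: "persistence",
		VolumeSource: corev1.VolumeSource{
			PersistentVolumeClaim: &corev1.PersistentVolumeClaimVolumeSource{
				ClaimName: "persistence-rabbitmq-server-0",
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```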
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.714393 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fb30d21bff812042c3ba73d535733d4a541c8291c9170e06c00ec0a2b33d56dc/globalmount\"" pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.718596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g79d\" (UniqueName: \"kubernetes.io/projected/2b55c421-1415-42f3-a604-93a5405aa469-kube-api-access-8g79d\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.758777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aba1533d-9f62-45f6-af1a-c2781d240c47\") pod \"rabbitmq-server-0\" (UID: \"2b55c421-1415-42f3-a604-93a5405aa469\") " pod="openstack/rabbitmq-server-0"
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.783521 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784267 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q52v\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret\") pod \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\" (UID: \"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad\") "
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.784015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.785651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.785718 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.788579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v" (OuterVolumeSpecName: "kube-api-access-4q52v") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "kube-api-access-4q52v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.790418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info" (OuterVolumeSpecName: "pod-info") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.790528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.799038 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb" (OuterVolumeSpecName: "persistence") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb". PluginName "kubernetes.io/csi", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.810392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf" (OuterVolumeSpecName: "server-conf") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.871715 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" (UID: "a75b1e0f-e99a-4afd-a3f6-f1454f09ddad"). InnerVolumeSpecName "rabbitmq-confd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.877342 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886480 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886501 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q52v\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-kube-api-access-4q52v\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886532 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") on node \"crc\" " Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886542 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-server-conf\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886552 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886562 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886571 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-pod-info\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886579 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.886587 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.908359 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.908562 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb") on node "crc" Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.947386 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"] Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.947660 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="dnsmasq-dns" containerID="cri-o://399cd8848f9fb0f807537512005e6486714160725cedccc715bf8db9def2a1ed" gracePeriod=10 Nov 29 02:32:54 crc kubenswrapper[4749]: I1129 02:32:54.987667 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.050650 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.104046 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6695b886-1903-4ce0-a4d6-c0b230522074" path="/var/lib/kubelet/pods/6695b886-1903-4ce0-a4d6-c0b230522074/volumes" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.351850 4749 generic.go:334] "Generic (PLEG): container finished" podID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerID="399cd8848f9fb0f807537512005e6486714160725cedccc715bf8db9def2a1ed" exitCode=0 Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.351934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" event={"ID":"2915b260-1e9b-4711-bbcb-0f04e4385aa0","Type":"ContainerDied","Data":"399cd8848f9fb0f807537512005e6486714160725cedccc715bf8db9def2a1ed"} Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.354545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a75b1e0f-e99a-4afd-a3f6-f1454f09ddad","Type":"ContainerDied","Data":"a876528b251e2d9b3a2faad72da949a5f072a20c358bb010eb89524881babd06"} Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.354597 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.354603 4749 scope.go:117] "RemoveContainer" containerID="37bd6c094aa1a10f8a5e94805a5c46a07c77866f24b4482efae1548c4cf1ed09" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.376567 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.379141 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.380502 4749 scope.go:117] "RemoveContainer" containerID="cf40cd595d06f4a927f7d796b6fa8a44283515e4b110ceb3b0d7409178238c7d" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.384251 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.408981 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:32:55 crc kubenswrapper[4749]: E1129 02:32:55.409385 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="init" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409400 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="init" Nov 29 02:32:55 crc kubenswrapper[4749]: E1129 02:32:55.409413 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="dnsmasq-dns" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409421 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="dnsmasq-dns" Nov 29 02:32:55 crc kubenswrapper[4749]: E1129 02:32:55.409436 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="setup-container" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409446 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="setup-container" Nov 29 02:32:55 crc kubenswrapper[4749]: E1129 02:32:55.409465 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="rabbitmq" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409473 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="rabbitmq" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409661 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" containerName="dnsmasq-dns" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.409677 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" containerName="rabbitmq" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.410644 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.412628 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kxh5r" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.412953 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.413186 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.413365 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.413502 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.427551 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.494006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc\") pod \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.494090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config\") pod \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.494332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgwc2\" (UniqueName: \"kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2\") pod \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\" (UID: \"2915b260-1e9b-4711-bbcb-0f04e4385aa0\") " Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.503086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2" (OuterVolumeSpecName: "kube-api-access-pgwc2") pod "2915b260-1e9b-4711-bbcb-0f04e4385aa0" (UID: "2915b260-1e9b-4711-bbcb-0f04e4385aa0"). InnerVolumeSpecName "kube-api-access-pgwc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.526536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2915b260-1e9b-4711-bbcb-0f04e4385aa0" (UID: "2915b260-1e9b-4711-bbcb-0f04e4385aa0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.532140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config" (OuterVolumeSpecName: "config") pod "2915b260-1e9b-4711-bbcb-0f04e4385aa0" (UID: "2915b260-1e9b-4711-bbcb-0f04e4385aa0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.597877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.597942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.597967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wqk\" (UniqueName: \"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-kube-api-access-p2wqk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598405 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgwc2\" (UniqueName: \"kubernetes.io/projected/2915b260-1e9b-4711-bbcb-0f04e4385aa0-kube-api-access-pgwc2\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598422 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.598613 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2915b260-1e9b-4711-bbcb-0f04e4385aa0-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.619055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wqk\" (UniqueName: \"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-kube-api-access-p2wqk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.700921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.701012 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.701090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.701187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.701924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.702383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.702551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.705475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.708691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.708745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.709075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.710432 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.710459 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c51c2b13dbf0b9756210c6f61ae1019431eed3ed4124bcd7795448ba4b882c80/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.724317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wqk\" (UniqueName: \"kubernetes.io/projected/a0af1f7d-07fd-485b-ba0e-57d4e2a1c781-kube-api-access-p2wqk\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.751161 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e232b68-fbe7-46ef-a77d-840cf8f750fb\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781\") " pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:55 crc kubenswrapper[4749]: I1129 02:32:55.774653 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.021675 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 29 02:32:56 crc kubenswrapper[4749]: W1129 02:32:56.029825 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0af1f7d_07fd_485b_ba0e_57d4e2a1c781.slice/crio-d72d9d274ee83cccc71fa0f12a6c3da58d9b89057bff8b80228c1543135d24b4 WatchSource:0}: Error finding container d72d9d274ee83cccc71fa0f12a6c3da58d9b89057bff8b80228c1543135d24b4: Status 404 returned error can't find the container with id d72d9d274ee83cccc71fa0f12a6c3da58d9b89057bff8b80228c1543135d24b4 Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.370264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b55c421-1415-42f3-a604-93a5405aa469","Type":"ContainerStarted","Data":"d63e36009a6f4030dccc39b0f373ef317995879123cc65b456cbe4a11f6e1605"} Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.373617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" event={"ID":"2915b260-1e9b-4711-bbcb-0f04e4385aa0","Type":"ContainerDied","Data":"73b20c646271be95ec0b35047c5f80ba9a1553c5e96fd4a795350f1b564f5b20"} Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.373713 4749 scope.go:117] "RemoveContainer" containerID="399cd8848f9fb0f807537512005e6486714160725cedccc715bf8db9def2a1ed" Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.374484 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-jj8gj" Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.375310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781","Type":"ContainerStarted","Data":"d72d9d274ee83cccc71fa0f12a6c3da58d9b89057bff8b80228c1543135d24b4"} Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.413517 4749 scope.go:117] "RemoveContainer" containerID="4640918e3c76b95a1e9077c59fbd2cdba5959bf9a591fc29dbeb9a9772c47abd" Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.495067 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"] Nov 29 02:32:56 crc kubenswrapper[4749]: I1129 02:32:56.502166 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-jj8gj"] Nov 29 02:32:57 crc kubenswrapper[4749]: I1129 02:32:57.088643 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2915b260-1e9b-4711-bbcb-0f04e4385aa0" path="/var/lib/kubelet/pods/2915b260-1e9b-4711-bbcb-0f04e4385aa0/volumes" Nov 29 02:32:57 crc kubenswrapper[4749]: I1129 02:32:57.091049 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75b1e0f-e99a-4afd-a3f6-f1454f09ddad" path="/var/lib/kubelet/pods/a75b1e0f-e99a-4afd-a3f6-f1454f09ddad/volumes" Nov 29 02:32:58 crc kubenswrapper[4749]: I1129 02:32:58.399104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b55c421-1415-42f3-a604-93a5405aa469","Type":"ContainerStarted","Data":"a22203fb84a340fc7ecc5faaee2acc7f70d2e6b5f312805e6c2a2def7a2f90f6"} Nov 29 02:32:58 crc kubenswrapper[4749]: I1129 02:32:58.401334 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781","Type":"ContainerStarted","Data":"bb4a11a6e4ea9c85e7f9acfd1931943f5a85d2bae4e219c1a5a261baa2e9e84d"} Nov 29 02:33:25 crc kubenswrapper[4749]: I1129 02:33:25.373911 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:33:25 crc kubenswrapper[4749]: I1129 02:33:25.374409 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:33:30 crc kubenswrapper[4749]: I1129 02:33:30.761254 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0af1f7d-07fd-485b-ba0e-57d4e2a1c781" containerID="bb4a11a6e4ea9c85e7f9acfd1931943f5a85d2bae4e219c1a5a261baa2e9e84d" exitCode=0 Nov 29 02:33:30 crc kubenswrapper[4749]: I1129 02:33:30.761394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781","Type":"ContainerDied","Data":"bb4a11a6e4ea9c85e7f9acfd1931943f5a85d2bae4e219c1a5a261baa2e9e84d"} Nov 29 02:33:30 crc kubenswrapper[4749]: I1129 02:33:30.768854 4749 generic.go:334] "Generic (PLEG): container finished" podID="2b55c421-1415-42f3-a604-93a5405aa469" containerID="a22203fb84a340fc7ecc5faaee2acc7f70d2e6b5f312805e6c2a2def7a2f90f6" exitCode=0 Nov 29 02:33:30 crc kubenswrapper[4749]: I1129 02:33:30.768930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b55c421-1415-42f3-a604-93a5405aa469","Type":"ContainerDied","Data":"a22203fb84a340fc7ecc5faaee2acc7f70d2e6b5f312805e6c2a2def7a2f90f6"} Nov 29 02:33:31 crc kubenswrapper[4749]: I1129 02:33:31.779031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b55c421-1415-42f3-a604-93a5405aa469","Type":"ContainerStarted","Data":"fe75c2c99c5ffa45c59201954934cd7bfbe526caec35aadb643941014c3ab149"} Nov 29 02:33:31 crc kubenswrapper[4749]: I1129 02:33:31.779721 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 29 02:33:31 crc kubenswrapper[4749]: I1129 02:33:31.781741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0af1f7d-07fd-485b-ba0e-57d4e2a1c781","Type":"ContainerStarted","Data":"6aae7f8faf2f148975eb6211654b8250b4d99e9c206bfb14db8c8f851b6c3c0b"} Nov 29 02:33:31 crc kubenswrapper[4749]: I1129 02:33:31.781976 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:33:31 crc kubenswrapper[4749]: I1129 02:33:31.817347 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.817332628 podStartE2EDuration="37.817332628s" podCreationTimestamp="2025-11-29 02:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:33:31.81454211 +0000 UTC m=+4954.986691967" watchObservedRunningTime="2025-11-29 02:33:31.817332628 +0000 UTC m=+4954.989482485" Nov 29 02:33:31 crc 
kubenswrapper[4749]: I1129 02:33:31.852171 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.852151877 podStartE2EDuration="36.852151877s" podCreationTimestamp="2025-11-29 02:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:33:31.836767796 +0000 UTC m=+4955.008917673" watchObservedRunningTime="2025-11-29 02:33:31.852151877 +0000 UTC m=+4955.024301734" Nov 29 02:33:45 crc kubenswrapper[4749]: I1129 02:33:45.055500 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 29 02:33:45 crc kubenswrapper[4749]: I1129 02:33:45.777746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 29 02:33:53 crc kubenswrapper[4749]: I1129 02:33:53.871686 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Nov 29 02:33:53 crc kubenswrapper[4749]: I1129 02:33:53.873467 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 29 02:33:53 crc kubenswrapper[4749]: I1129 02:33:53.875358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz" Nov 29 02:33:53 crc kubenswrapper[4749]: I1129 02:33:53.880875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 29 02:33:53 crc kubenswrapper[4749]: I1129 02:33:53.972482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dlj\" (UniqueName: \"kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj\") pod \"mariadb-client-1-default\" (UID: \"e86a95eb-5376-4e88-bbac-ab9aebd0da88\") " pod="openstack/mariadb-client-1-default" Nov 29 02:33:54 crc kubenswrapper[4749]: I1129 02:33:54.074129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dlj\" (UniqueName: \"kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj\") pod \"mariadb-client-1-default\" (UID: \"e86a95eb-5376-4e88-bbac-ab9aebd0da88\") " pod="openstack/mariadb-client-1-default" Nov 29 02:33:54 crc kubenswrapper[4749]: I1129 02:33:54.104496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dlj\" (UniqueName: \"kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj\") pod \"mariadb-client-1-default\" (UID: \"e86a95eb-5376-4e88-bbac-ab9aebd0da88\") " pod="openstack/mariadb-client-1-default" Nov 29 02:33:54 crc kubenswrapper[4749]: I1129 02:33:54.200952 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 29 02:33:54 crc kubenswrapper[4749]: I1129 02:33:54.801971 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 29 02:33:54 crc kubenswrapper[4749]: W1129 02:33:54.811432 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode86a95eb_5376_4e88_bbac_ab9aebd0da88.slice/crio-c010aec2b769b4cd4b4dfb1b7d244e8a76d83eac5f3a99ee2362c8732d6a9280 WatchSource:0}: Error finding container c010aec2b769b4cd4b4dfb1b7d244e8a76d83eac5f3a99ee2362c8732d6a9280: Status 404 returned error can't find the container with id c010aec2b769b4cd4b4dfb1b7d244e8a76d83eac5f3a99ee2362c8732d6a9280 Nov 29 02:33:54 crc kubenswrapper[4749]: I1129 02:33:54.990967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"e86a95eb-5376-4e88-bbac-ab9aebd0da88","Type":"ContainerStarted","Data":"c010aec2b769b4cd4b4dfb1b7d244e8a76d83eac5f3a99ee2362c8732d6a9280"} Nov 29 02:33:55 crc kubenswrapper[4749]: I1129 02:33:55.374065 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:33:55 crc kubenswrapper[4749]: I1129 02:33:55.374121 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:33:56 crc kubenswrapper[4749]: I1129 02:33:56.002369 4749 generic.go:334] "Generic (PLEG): container finished" podID="e86a95eb-5376-4e88-bbac-ab9aebd0da88" containerID="9a9ae982b8e5d2f60686f788fc9a7a37ed26a51af3e168cc560fe5d02a830bbf" exitCode=0 Nov 29 02:33:56 crc kubenswrapper[4749]: I1129 02:33:56.002472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"e86a95eb-5376-4e88-bbac-ab9aebd0da88","Type":"ContainerDied","Data":"9a9ae982b8e5d2f60686f788fc9a7a37ed26a51af3e168cc560fe5d02a830bbf"} Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.470059 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.506258 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_e86a95eb-5376-4e88-bbac-ab9aebd0da88/mariadb-client-1-default/0.log" Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.531922 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.537501 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.554339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8dlj\" (UniqueName: \"kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj\") pod \"e86a95eb-5376-4e88-bbac-ab9aebd0da88\" (UID: \"e86a95eb-5376-4e88-bbac-ab9aebd0da88\") " Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.560540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj" (OuterVolumeSpecName: "kube-api-access-g8dlj") pod "e86a95eb-5376-4e88-bbac-ab9aebd0da88" (UID: "e86a95eb-5376-4e88-bbac-ab9aebd0da88"). InnerVolumeSpecName "kube-api-access-g8dlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:33:57 crc kubenswrapper[4749]: I1129 02:33:57.656600 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8dlj\" (UniqueName: \"kubernetes.io/projected/e86a95eb-5376-4e88-bbac-ab9aebd0da88-kube-api-access-g8dlj\") on node \"crc\" DevicePath \"\"" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.026321 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c010aec2b769b4cd4b4dfb1b7d244e8a76d83eac5f3a99ee2362c8732d6a9280" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.026362 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.080361 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Nov 29 02:33:58 crc kubenswrapper[4749]: E1129 02:33:58.080898 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86a95eb-5376-4e88-bbac-ab9aebd0da88" containerName="mariadb-client-1-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.080927 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86a95eb-5376-4e88-bbac-ab9aebd0da88" containerName="mariadb-client-1-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.081223 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86a95eb-5376-4e88-bbac-ab9aebd0da88" containerName="mariadb-client-1-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.082079 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.086268 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.091499 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.165949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6pw\" (UniqueName: \"kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw\") pod \"mariadb-client-2-default\" (UID: \"032bd504-7804-4889-9166-a59999bede0a\") " pod="openstack/mariadb-client-2-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.268527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6pw\" (UniqueName: \"kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw\") pod \"mariadb-client-2-default\" (UID: \"032bd504-7804-4889-9166-a59999bede0a\") " pod="openstack/mariadb-client-2-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.295913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6pw\" (UniqueName: \"kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw\") pod \"mariadb-client-2-default\" (UID: \"032bd504-7804-4889-9166-a59999bede0a\") " pod="openstack/mariadb-client-2-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.412291 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 29 02:33:58 crc kubenswrapper[4749]: I1129 02:33:58.787281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 29 02:33:58 crc kubenswrapper[4749]: W1129 02:33:58.794077 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod032bd504_7804_4889_9166_a59999bede0a.slice/crio-22a0ca16690ee140e4252588048944c244d1b8b2d8e96c6fcfd2ff6c433d523e WatchSource:0}: Error finding container 22a0ca16690ee140e4252588048944c244d1b8b2d8e96c6fcfd2ff6c433d523e: Status 404 returned error can't find the container with id 22a0ca16690ee140e4252588048944c244d1b8b2d8e96c6fcfd2ff6c433d523e Nov 29 02:33:59 crc kubenswrapper[4749]: I1129 02:33:59.037378 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"032bd504-7804-4889-9166-a59999bede0a","Type":"ContainerStarted","Data":"e5b72ca19569b9e57e824122e8dafcd2a4bd05e1e4a554464ed33844c83d55ea"} Nov 29 02:33:59 crc kubenswrapper[4749]: I1129 02:33:59.037454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"032bd504-7804-4889-9166-a59999bede0a","Type":"ContainerStarted","Data":"22a0ca16690ee140e4252588048944c244d1b8b2d8e96c6fcfd2ff6c433d523e"} Nov 29 02:33:59 crc kubenswrapper[4749]: I1129 02:33:59.067610 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.067583935 podStartE2EDuration="1.067583935s" podCreationTimestamp="2025-11-29 02:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:33:59.066683953 +0000 UTC 
m=+4982.238833850" watchObservedRunningTime="2025-11-29 02:33:59.067583935 +0000 UTC m=+4982.239733822" Nov 29 02:33:59 crc kubenswrapper[4749]: I1129 02:33:59.088982 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86a95eb-5376-4e88-bbac-ab9aebd0da88" path="/var/lib/kubelet/pods/e86a95eb-5376-4e88-bbac-ab9aebd0da88/volumes" Nov 29 02:34:00 crc kubenswrapper[4749]: I1129 02:34:00.049459 4749 generic.go:334] "Generic (PLEG): container finished" podID="032bd504-7804-4889-9166-a59999bede0a" containerID="e5b72ca19569b9e57e824122e8dafcd2a4bd05e1e4a554464ed33844c83d55ea" exitCode=1 Nov 29 02:34:00 crc kubenswrapper[4749]: I1129 02:34:00.049526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"032bd504-7804-4889-9166-a59999bede0a","Type":"ContainerDied","Data":"e5b72ca19569b9e57e824122e8dafcd2a4bd05e1e4a554464ed33844c83d55ea"} Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.576262 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.633021 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.645661 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.725971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6pw\" (UniqueName: \"kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw\") pod \"032bd504-7804-4889-9166-a59999bede0a\" (UID: \"032bd504-7804-4889-9166-a59999bede0a\") " Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.735978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw" (OuterVolumeSpecName: "kube-api-access-kb6pw") pod "032bd504-7804-4889-9166-a59999bede0a" (UID: "032bd504-7804-4889-9166-a59999bede0a"). InnerVolumeSpecName "kube-api-access-kb6pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:01 crc kubenswrapper[4749]: I1129 02:34:01.828823 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb6pw\" (UniqueName: \"kubernetes.io/projected/032bd504-7804-4889-9166-a59999bede0a-kube-api-access-kb6pw\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.089531 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22a0ca16690ee140e4252588048944c244d1b8b2d8e96c6fcfd2ff6c433d523e" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.089621 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.181492 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Nov 29 02:34:02 crc kubenswrapper[4749]: E1129 02:34:02.182024 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="032bd504-7804-4889-9166-a59999bede0a" containerName="mariadb-client-2-default" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.182049 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="032bd504-7804-4889-9166-a59999bede0a" containerName="mariadb-client-2-default" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.182462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="032bd504-7804-4889-9166-a59999bede0a" containerName="mariadb-client-2-default" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.183385 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.193227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.229298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.238565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g598\" (UniqueName: \"kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598\") pod \"mariadb-client-1\" (UID: \"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7\") " pod="openstack/mariadb-client-1" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.340391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g598\" (UniqueName: \"kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598\") pod \"mariadb-client-1\" (UID: \"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7\") " pod="openstack/mariadb-client-1" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.378372 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g598\" (UniqueName: \"kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598\") pod \"mariadb-client-1\" (UID: \"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7\") " pod="openstack/mariadb-client-1" Nov 29 02:34:02 crc kubenswrapper[4749]: I1129 02:34:02.559077 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Nov 29 02:34:03 crc kubenswrapper[4749]: I1129 02:34:03.096015 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032bd504-7804-4889-9166-a59999bede0a" path="/var/lib/kubelet/pods/032bd504-7804-4889-9166-a59999bede0a/volumes" Nov 29 02:34:03 crc kubenswrapper[4749]: I1129 02:34:03.201327 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Nov 29 02:34:03 crc kubenswrapper[4749]: W1129 02:34:03.203666 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7c9711_a7c0_4fe3_8fce_8bdbb761b9e7.slice/crio-1e8d5c037f1aa16d07247f4efece0a418849188d8311e3e8a2d5b46aed2288ed WatchSource:0}: Error finding container 1e8d5c037f1aa16d07247f4efece0a418849188d8311e3e8a2d5b46aed2288ed: Status 404 returned error can't find the container with id 1e8d5c037f1aa16d07247f4efece0a418849188d8311e3e8a2d5b46aed2288ed Nov 29 02:34:04 crc kubenswrapper[4749]: I1129 02:34:04.114357 4749 generic.go:334] "Generic (PLEG): container finished" podID="ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" containerID="1f8aae235fd879794a524e3cc6c815a2e62a10bf87a95be4f90b947dbef4d72a" exitCode=0 Nov 29 02:34:04 crc kubenswrapper[4749]: I1129 02:34:04.114434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7","Type":"ContainerDied","Data":"1f8aae235fd879794a524e3cc6c815a2e62a10bf87a95be4f90b947dbef4d72a"} Nov 29 02:34:04 crc kubenswrapper[4749]: I1129 02:34:04.114499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7","Type":"ContainerStarted","Data":"1e8d5c037f1aa16d07247f4efece0a418849188d8311e3e8a2d5b46aed2288ed"} Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.619393 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.639849 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7/mariadb-client-1/0.log" Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.698919 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.700506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g598\" (UniqueName: \"kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598\") pod \"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7\" (UID: \"ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7\") " Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.707987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598" (OuterVolumeSpecName: "kube-api-access-7g598") pod "ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" (UID: "ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7"). InnerVolumeSpecName "kube-api-access-7g598". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.711884 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Nov 29 02:34:05 crc kubenswrapper[4749]: I1129 02:34:05.802835 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g598\" (UniqueName: \"kubernetes.io/projected/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7-kube-api-access-7g598\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.137244 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8d5c037f1aa16d07247f4efece0a418849188d8311e3e8a2d5b46aed2288ed" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.137297 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.166508 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Nov 29 02:34:06 crc kubenswrapper[4749]: E1129 02:34:06.166999 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" containerName="mariadb-client-1" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.167028 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" containerName="mariadb-client-1" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.167368 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" containerName="mariadb-client-1" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.168157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.171632 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.185655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.311595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-649mh\" (UniqueName: \"kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh\") pod \"mariadb-client-4-default\" (UID: \"6e73d899-8d60-4a75-88c3-781cfa9bb31f\") " pod="openstack/mariadb-client-4-default" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.413348 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-649mh\" (UniqueName: \"kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh\") pod \"mariadb-client-4-default\" (UID: \"6e73d899-8d60-4a75-88c3-781cfa9bb31f\") " pod="openstack/mariadb-client-4-default" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.444827 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-649mh\" (UniqueName: \"kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh\") pod \"mariadb-client-4-default\" (UID: \"6e73d899-8d60-4a75-88c3-781cfa9bb31f\") " pod="openstack/mariadb-client-4-default" Nov 29 02:34:06 crc kubenswrapper[4749]: I1129 02:34:06.504602 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 02:34:07 crc kubenswrapper[4749]: I1129 02:34:06.956509 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Nov 29 02:34:07 crc kubenswrapper[4749]: W1129 02:34:07.005461 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e73d899_8d60_4a75_88c3_781cfa9bb31f.slice/crio-0bda3f701c7cc147401bc55285b2de9b6ae76f72ea2bf5c41d24443dba41277c WatchSource:0}: Error finding container 0bda3f701c7cc147401bc55285b2de9b6ae76f72ea2bf5c41d24443dba41277c: Status 404 returned error can't find the container with id 0bda3f701c7cc147401bc55285b2de9b6ae76f72ea2bf5c41d24443dba41277c
Nov 29 02:34:07 crc kubenswrapper[4749]: I1129 02:34:07.115681 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7" path="/var/lib/kubelet/pods/ba7c9711-a7c0-4fe3-8fce-8bdbb761b9e7/volumes"
Nov 29 02:34:07 crc kubenswrapper[4749]: I1129 02:34:07.145806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"6e73d899-8d60-4a75-88c3-781cfa9bb31f","Type":"ContainerStarted","Data":"0bda3f701c7cc147401bc55285b2de9b6ae76f72ea2bf5c41d24443dba41277c"}
Nov 29 02:34:08 crc kubenswrapper[4749]: I1129 02:34:08.160576 4749 generic.go:334] "Generic (PLEG): container finished" podID="6e73d899-8d60-4a75-88c3-781cfa9bb31f" containerID="009c5d59fe51ebe1a9aab29a36d065d2edb2cac00cac356c98ac15e1e518d118" exitCode=0
Nov 29 02:34:08 crc kubenswrapper[4749]: I1129 02:34:08.160670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"6e73d899-8d60-4a75-88c3-781cfa9bb31f","Type":"ContainerDied","Data":"009c5d59fe51ebe1a9aab29a36d065d2edb2cac00cac356c98ac15e1e518d118"}
Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.616567 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.641970 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_6e73d899-8d60-4a75-88c3-781cfa9bb31f/mariadb-client-4-default/0.log"
Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.669557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-649mh\" (UniqueName: \"kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh\") pod \"6e73d899-8d60-4a75-88c3-781cfa9bb31f\" (UID: \"6e73d899-8d60-4a75-88c3-781cfa9bb31f\") "
Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.674456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"]
Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.678543 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh" (OuterVolumeSpecName: "kube-api-access-649mh") pod "6e73d899-8d60-4a75-88c3-781cfa9bb31f" (UID: "6e73d899-8d60-4a75-88c3-781cfa9bb31f"). InnerVolumeSpecName "kube-api-access-649mh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.679006 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 29 02:34:09 crc kubenswrapper[4749]: I1129 02:34:09.772712 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-649mh\" (UniqueName: \"kubernetes.io/projected/6e73d899-8d60-4a75-88c3-781cfa9bb31f-kube-api-access-649mh\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:10 crc kubenswrapper[4749]: I1129 02:34:10.185569 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bda3f701c7cc147401bc55285b2de9b6ae76f72ea2bf5c41d24443dba41277c" Nov 29 02:34:10 crc kubenswrapper[4749]: I1129 02:34:10.185669 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 29 02:34:11 crc kubenswrapper[4749]: I1129 02:34:11.091003 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e73d899-8d60-4a75-88c3-781cfa9bb31f" path="/var/lib/kubelet/pods/6e73d899-8d60-4a75-88c3-781cfa9bb31f/volumes" Nov 29 02:34:13 crc kubenswrapper[4749]: I1129 02:34:13.624896 4749 scope.go:117] "RemoveContainer" containerID="ffa33d9e4d0351fb4f2c8a7d81a5e2c55b6d6ce9fd8ee92124ba30e011b039fc" Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.355591 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Nov 29 02:34:14 crc kubenswrapper[4749]: E1129 02:34:14.356924 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e73d899-8d60-4a75-88c3-781cfa9bb31f" containerName="mariadb-client-4-default" Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.357041 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e73d899-8d60-4a75-88c3-781cfa9bb31f" containerName="mariadb-client-4-default" Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.357335 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e73d899-8d60-4a75-88c3-781cfa9bb31f" containerName="mariadb-client-4-default" Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.358172 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.362095 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz"
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.370696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.460188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9975v\" (UniqueName: \"kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v\") pod \"mariadb-client-5-default\" (UID: \"7be9df34-a358-46ae-9e34-5654406b83c1\") " pod="openstack/mariadb-client-5-default"
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.562446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9975v\" (UniqueName: \"kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v\") pod \"mariadb-client-5-default\" (UID: \"7be9df34-a358-46ae-9e34-5654406b83c1\") " pod="openstack/mariadb-client-5-default"
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.593695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9975v\" (UniqueName: \"kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v\") pod \"mariadb-client-5-default\" (UID: \"7be9df34-a358-46ae-9e34-5654406b83c1\") " pod="openstack/mariadb-client-5-default"
Nov 29 02:34:14 crc kubenswrapper[4749]: I1129 02:34:14.690874 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Nov 29 02:34:15 crc kubenswrapper[4749]: I1129 02:34:15.373601 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"]
Nov 29 02:34:16 crc kubenswrapper[4749]: I1129 02:34:16.254826 4749 generic.go:334] "Generic (PLEG): container finished" podID="7be9df34-a358-46ae-9e34-5654406b83c1" containerID="d110251ce2f74fbcbb495fbe4f6e9ea91a32b6cea2d96ce51646fa2c8080ecdd" exitCode=0
Nov 29 02:34:16 crc kubenswrapper[4749]: I1129 02:34:16.254950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"7be9df34-a358-46ae-9e34-5654406b83c1","Type":"ContainerDied","Data":"d110251ce2f74fbcbb495fbe4f6e9ea91a32b6cea2d96ce51646fa2c8080ecdd"}
Nov 29 02:34:16 crc kubenswrapper[4749]: I1129 02:34:16.256540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"7be9df34-a358-46ae-9e34-5654406b83c1","Type":"ContainerStarted","Data":"dd66677209cf60524aeb4781c1fe271b1e46ee1957769799e41ebc0e1a45582d"}
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.473292 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"]
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.477393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.481402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"]
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.517307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.517376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.517415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.619211 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.619277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.619301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.620038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.620070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.644697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf"
"MountVolume.SetUp succeeded for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") pod \"certified-operators-lsdxf\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") " pod="openshift-marketplace/certified-operators-lsdxf" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.738245 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.760302 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_7be9df34-a358-46ae-9e34-5654406b83c1/mariadb-client-5-default/0.log" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.793299 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.801315 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.801755 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsdxf" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.822563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9975v\" (UniqueName: \"kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v\") pod \"7be9df34-a358-46ae-9e34-5654406b83c1\" (UID: \"7be9df34-a358-46ae-9e34-5654406b83c1\") " Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.829357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v" (OuterVolumeSpecName: "kube-api-access-9975v") pod "7be9df34-a358-46ae-9e34-5654406b83c1" (UID: "7be9df34-a358-46ae-9e34-5654406b83c1"). InnerVolumeSpecName "kube-api-access-9975v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.924017 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9975v\" (UniqueName: \"kubernetes.io/projected/7be9df34-a358-46ae-9e34-5654406b83c1-kube-api-access-9975v\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.990705 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Nov 29 02:34:17 crc kubenswrapper[4749]: E1129 02:34:17.990995 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be9df34-a358-46ae-9e34-5654406b83c1" containerName="mariadb-client-5-default" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.991012 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be9df34-a358-46ae-9e34-5654406b83c1" containerName="mariadb-client-5-default" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.991141 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be9df34-a358-46ae-9e34-5654406b83c1" containerName="mariadb-client-5-default" Nov 29 02:34:17 crc kubenswrapper[4749]: I1129 02:34:17.991649 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.007914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.025691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbw8q\" (UniqueName: \"kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q\") pod \"mariadb-client-6-default\" (UID: \"8cd4f576-cc90-429e-afa9-b3b9166707e0\") " pod="openstack/mariadb-client-6-default"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.126944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbw8q\" (UniqueName: \"kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q\") pod \"mariadb-client-6-default\" (UID: \"8cd4f576-cc90-429e-afa9-b3b9166707e0\") " pod="openstack/mariadb-client-6-default"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.146469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbw8q\" (UniqueName: \"kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q\") pod \"mariadb-client-6-default\" (UID: \"8cd4f576-cc90-429e-afa9-b3b9166707e0\") " pod="openstack/mariadb-client-6-default"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.236696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"]
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.274461 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.275441 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd66677209cf60524aeb4781c1fe271b1e46ee1957769799e41ebc0e1a45582d"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.278294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerStarted","Data":"ac2ba4995ea2ede4228361968f19aac27d6553b06c8a1844f5acb823cfa63827"}
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.311392 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Nov 29 02:34:18 crc kubenswrapper[4749]: I1129 02:34:18.672607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"]
Nov 29 02:34:18 crc kubenswrapper[4749]: W1129 02:34:18.673490 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cd4f576_cc90_429e_afa9_b3b9166707e0.slice/crio-0ae1799950c163c2bff20f71a17b5fa7e482f7bb489ca35eac9f86673d5506ca WatchSource:0}: Error finding container 0ae1799950c163c2bff20f71a17b5fa7e482f7bb489ca35eac9f86673d5506ca: Status 404 returned error can't find the container with id 0ae1799950c163c2bff20f71a17b5fa7e482f7bb489ca35eac9f86673d5506ca
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.087436 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be9df34-a358-46ae-9e34-5654406b83c1" path="/var/lib/kubelet/pods/7be9df34-a358-46ae-9e34-5654406b83c1/volumes"
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.291276 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerID="55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217" exitCode=0
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.291436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerDied","Data":"55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217"}
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.297689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8cd4f576-cc90-429e-afa9-b3b9166707e0","Type":"ContainerStarted","Data":"1f4fbabe45131a1b877202013712e9c24d10996a5ce276ee9d7ab0b91ef87d64"}
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.297741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8cd4f576-cc90-429e-afa9-b3b9166707e0","Type":"ContainerStarted","Data":"0ae1799950c163c2bff20f71a17b5fa7e482f7bb489ca35eac9f86673d5506ca"}
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.361342 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.361314437 podStartE2EDuration="2.361314437s" podCreationTimestamp="2025-11-29 02:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:34:19.350862575 +0000 UTC m=+5002.523012462" watchObservedRunningTime="2025-11-29 02:34:19.361314437 +0000 UTC m=+5002.533464324"
Nov 29 02:34:19 crc kubenswrapper[4749]: I1129 02:34:19.433821 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_8cd4f576-cc90-429e-afa9-b3b9166707e0/mariadb-client-6-default/0.log"
Nov 29 02:34:20 crc kubenswrapper[4749]: I1129 02:34:20.310720 4749 generic.go:334] "Generic (PLEG): container finished" podID="8cd4f576-cc90-429e-afa9-b3b9166707e0" containerID="1f4fbabe45131a1b877202013712e9c24d10996a5ce276ee9d7ab0b91ef87d64" exitCode=1
Nov 29 02:34:20 crc kubenswrapper[4749]: I1129 02:34:20.310820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8cd4f576-cc90-429e-afa9-b3b9166707e0","Type":"ContainerDied","Data":"1f4fbabe45131a1b877202013712e9c24d10996a5ce276ee9d7ab0b91ef87d64"}
event={"ID":"8cd4f576-cc90-429e-afa9-b3b9166707e0","Type":"ContainerDied","Data":"1f4fbabe45131a1b877202013712e9c24d10996a5ce276ee9d7ab0b91ef87d64"} Nov 29 02:34:21 crc kubenswrapper[4749]: I1129 02:34:21.326000 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerID="db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80" exitCode=0 Nov 29 02:34:21 crc kubenswrapper[4749]: I1129 02:34:21.326331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerDied","Data":"db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80"} Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.105261 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.176456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.182793 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.290668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbw8q\" (UniqueName: \"kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q\") pod \"8cd4f576-cc90-429e-afa9-b3b9166707e0\" (UID: \"8cd4f576-cc90-429e-afa9-b3b9166707e0\") " Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.301456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q" (OuterVolumeSpecName: "kube-api-access-nbw8q") pod "8cd4f576-cc90-429e-afa9-b3b9166707e0" (UID: "8cd4f576-cc90-429e-afa9-b3b9166707e0"). InnerVolumeSpecName "kube-api-access-nbw8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.334116 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Nov 29 02:34:22 crc kubenswrapper[4749]: E1129 02:34:22.336418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd4f576-cc90-429e-afa9-b3b9166707e0" containerName="mariadb-client-6-default" Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.336447 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd4f576-cc90-429e-afa9-b3b9166707e0" containerName="mariadb-client-6-default" Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.336729 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd4f576-cc90-429e-afa9-b3b9166707e0" containerName="mariadb-client-6-default" Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.337453 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.342784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerStarted","Data":"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c"}
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.344831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.346816 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae1799950c163c2bff20f71a17b5fa7e482f7bb489ca35eac9f86673d5506ca"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.346908 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.392332 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lsdxf" podStartSLOduration=2.638463697 podStartE2EDuration="5.392314474s" podCreationTimestamp="2025-11-29 02:34:17 +0000 UTC" firstStartedPulling="2025-11-29 02:34:19.293661246 +0000 UTC m=+5002.465811143" lastFinishedPulling="2025-11-29 02:34:22.047512023 +0000 UTC m=+5005.219661920" observedRunningTime="2025-11-29 02:34:22.389679341 +0000 UTC m=+5005.561829208" watchObservedRunningTime="2025-11-29 02:34:22.392314474 +0000 UTC m=+5005.564464351"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.392718 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbw8q\" (UniqueName: \"kubernetes.io/projected/8cd4f576-cc90-429e-afa9-b3b9166707e0-kube-api-access-nbw8q\") on node \"crc\" DevicePath \"\""
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.494504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bsfh\" (UniqueName: \"kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh\") pod \"mariadb-client-7-default\" (UID: \"5e78e65b-bfb3-4303-b646-4b235fef4a69\") " pod="openstack/mariadb-client-7-default"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.596029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bsfh\" (UniqueName: \"kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh\") pod \"mariadb-client-7-default\" (UID: \"5e78e65b-bfb3-4303-b646-4b235fef4a69\") " pod="openstack/mariadb-client-7-default"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.617612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bsfh\" (UniqueName: \"kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh\") pod \"mariadb-client-7-default\" (UID: \"5e78e65b-bfb3-4303-b646-4b235fef4a69\") " pod="openstack/mariadb-client-7-default"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.672546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Nov 29 02:34:22 crc kubenswrapper[4749]: I1129 02:34:22.987374 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"]
Nov 29 02:34:23 crc kubenswrapper[4749]: I1129 02:34:23.085017 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cd4f576-cc90-429e-afa9-b3b9166707e0" path="/var/lib/kubelet/pods/8cd4f576-cc90-429e-afa9-b3b9166707e0/volumes"
Nov 29 02:34:23 crc kubenswrapper[4749]: I1129 02:34:23.361632 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e78e65b-bfb3-4303-b646-4b235fef4a69" containerID="eeab27a27652c77affae1333a6a009e90a98f829f174f64e2b1edae824206de9" exitCode=0
Nov 29 02:34:23 crc kubenswrapper[4749]: I1129 02:34:23.362085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5e78e65b-bfb3-4303-b646-4b235fef4a69","Type":"ContainerDied","Data":"eeab27a27652c77affae1333a6a009e90a98f829f174f64e2b1edae824206de9"}
Nov 29 02:34:23 crc kubenswrapper[4749]: I1129 02:34:23.362126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5e78e65b-bfb3-4303-b646-4b235fef4a69","Type":"ContainerStarted","Data":"c318eeb20d81290b09a5fb87b9cda000b85ba0f2c6fe1d314ad918f3efaf114f"}
Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.800765 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.824499 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_5e78e65b-bfb3-4303-b646-4b235fef4a69/mariadb-client-7-default/0.log"
Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.861819 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bsfh\" (UniqueName: \"kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh\") pod \"5e78e65b-bfb3-4303-b646-4b235fef4a69\" (UID: \"5e78e65b-bfb3-4303-b646-4b235fef4a69\") "
Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.869580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh" (OuterVolumeSpecName: "kube-api-access-6bsfh") pod "5e78e65b-bfb3-4303-b646-4b235fef4a69" (UID: "5e78e65b-bfb3-4303-b646-4b235fef4a69"). InnerVolumeSpecName "kube-api-access-6bsfh". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.893756 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.902406 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 29 02:34:24 crc kubenswrapper[4749]: I1129 02:34:24.963189 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bsfh\" (UniqueName: \"kubernetes.io/projected/5e78e65b-bfb3-4303-b646-4b235fef4a69-kube-api-access-6bsfh\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.089785 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e78e65b-bfb3-4303-b646-4b235fef4a69" path="/var/lib/kubelet/pods/5e78e65b-bfb3-4303-b646-4b235fef4a69/volumes" Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.090700 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Nov 29 02:34:25 crc kubenswrapper[4749]: E1129 02:34:25.090936 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e78e65b-bfb3-4303-b646-4b235fef4a69" containerName="mariadb-client-7-default" Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.090951 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e78e65b-bfb3-4303-b646-4b235fef4a69" containerName="mariadb-client-7-default" Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.091087 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e78e65b-bfb3-4303-b646-4b235fef4a69" containerName="mariadb-client-7-default" Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.091575 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.106746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"]
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.167119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m927\" (UniqueName: \"kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927\") pod \"mariadb-client-2\" (UID: \"83b20f23-3180-413a-93a5-7cb32cb20289\") " pod="openstack/mariadb-client-2"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.269681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m927\" (UniqueName: \"kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927\") pod \"mariadb-client-2\" (UID: \"83b20f23-3180-413a-93a5-7cb32cb20289\") " pod="openstack/mariadb-client-2"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.292381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m927\" (UniqueName: \"kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927\") pod \"mariadb-client-2\" (UID: \"83b20f23-3180-413a-93a5-7cb32cb20289\") " pod="openstack/mariadb-client-2"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.374656 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.374743 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.374808 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.375664 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.375771 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" gracePeriod=600
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.382051 4749 scope.go:117] "RemoveContainer" containerID="eeab27a27652c77affae1333a6a009e90a98f829f174f64e2b1edae824206de9"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.382083 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.420030 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Nov 29 02:34:25 crc kubenswrapper[4749]: E1129 02:34:25.518932 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:34:25 crc kubenswrapper[4749]: I1129 02:34:25.772028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"]
Nov 29 02:34:25 crc kubenswrapper[4749]: W1129 02:34:25.778414 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83b20f23_3180_413a_93a5_7cb32cb20289.slice/crio-4db8b0bb2a1f6fb5bdd8e4d7168f6bc58372edb83d71f717fd482fc5477cc4c8 WatchSource:0}: Error finding container 4db8b0bb2a1f6fb5bdd8e4d7168f6bc58372edb83d71f717fd482fc5477cc4c8: Status 404 returned error can't find the container with id 4db8b0bb2a1f6fb5bdd8e4d7168f6bc58372edb83d71f717fd482fc5477cc4c8
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.409782 4749 generic.go:334] "Generic (PLEG): container finished" podID="83b20f23-3180-413a-93a5-7cb32cb20289" containerID="f7671c09b5aea297d95ec8a96829f72d9a8ea2cf989f96bb0766b86cf608358a" exitCode=0
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.409891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"83b20f23-3180-413a-93a5-7cb32cb20289","Type":"ContainerDied","Data":"f7671c09b5aea297d95ec8a96829f72d9a8ea2cf989f96bb0766b86cf608358a"}
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.409933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"83b20f23-3180-413a-93a5-7cb32cb20289","Type":"ContainerStarted","Data":"4db8b0bb2a1f6fb5bdd8e4d7168f6bc58372edb83d71f717fd482fc5477cc4c8"}
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.420766 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" exitCode=0
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.420816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24"}
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.420852 4749 scope.go:117] "RemoveContainer" containerID="df2ae3add145d734c1564dba300d08f14f65ca639c4b0a89cde4ceb1f506a682"
Nov 29 02:34:26 crc kubenswrapper[4749]: I1129 02:34:26.421484 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24"
Nov 29 02:34:26 crc kubenswrapper[4749]: E1129 02:34:26.421758 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.802171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.802586 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.864761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.930686 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.954597 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_83b20f23-3180-413a-93a5-7cb32cb20289/mariadb-client-2/0.log"
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.985952 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"]
Nov 29 02:34:27 crc kubenswrapper[4749]: I1129 02:34:27.993858 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"]
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.124818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m927\" (UniqueName: \"kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927\") pod \"83b20f23-3180-413a-93a5-7cb32cb20289\" (UID: \"83b20f23-3180-413a-93a5-7cb32cb20289\") "
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.133191 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927" (OuterVolumeSpecName: "kube-api-access-2m927") pod "83b20f23-3180-413a-93a5-7cb32cb20289" (UID: "83b20f23-3180-413a-93a5-7cb32cb20289"). InnerVolumeSpecName "kube-api-access-2m927". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.227661 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m927\" (UniqueName: \"kubernetes.io/projected/83b20f23-3180-413a-93a5-7cb32cb20289-kube-api-access-2m927\") on node \"crc\" DevicePath \"\""
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.494084 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2"
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.494088 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4db8b0bb2a1f6fb5bdd8e4d7168f6bc58372edb83d71f717fd482fc5477cc4c8"
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.571835 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:28 crc kubenswrapper[4749]: I1129 02:34:28.653783 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"]
Nov 29 02:34:29 crc kubenswrapper[4749]: I1129 02:34:29.091626 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83b20f23-3180-413a-93a5-7cb32cb20289" path="/var/lib/kubelet/pods/83b20f23-3180-413a-93a5-7cb32cb20289/volumes"
Nov 29 02:34:30 crc kubenswrapper[4749]: I1129 02:34:30.510424 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lsdxf" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="registry-server" containerID="cri-o://dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c" gracePeriod=2
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.057889 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsdxf"
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.071280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content\") pod \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") "
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.071400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities\") pod \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") "
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.071492 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") pod \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\" (UID: \"4e608ac1-362b-4e26-b3b9-18f7eb3187ab\") "
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.072462 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities" (OuterVolumeSpecName: "utilities") pod "4e608ac1-362b-4e26-b3b9-18f7eb3187ab" (UID: "4e608ac1-362b-4e26-b3b9-18f7eb3187ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.082090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd" (OuterVolumeSpecName: "kube-api-access-kf6sd") pod "4e608ac1-362b-4e26-b3b9-18f7eb3187ab" (UID: "4e608ac1-362b-4e26-b3b9-18f7eb3187ab"). InnerVolumeSpecName "kube-api-access-kf6sd". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.172915 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.172997 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf6sd\" (UniqueName: \"kubernetes.io/projected/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-kube-api-access-kf6sd\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.238679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e608ac1-362b-4e26-b3b9-18f7eb3187ab" (UID: "4e608ac1-362b-4e26-b3b9-18f7eb3187ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.275296 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e608ac1-362b-4e26-b3b9-18f7eb3187ab-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.526948 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerDied","Data":"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c"} Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.526984 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lsdxf" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.527065 4749 scope.go:117] "RemoveContainer" containerID="dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.527347 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerID="dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c" exitCode=0 Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.527452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lsdxf" event={"ID":"4e608ac1-362b-4e26-b3b9-18f7eb3187ab","Type":"ContainerDied","Data":"ac2ba4995ea2ede4228361968f19aac27d6553b06c8a1844f5acb823cfa63827"} Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.564123 4749 scope.go:117] "RemoveContainer" containerID="db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.597171 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"] Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.617478 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lsdxf"] Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.620566 4749 scope.go:117] "RemoveContainer" containerID="55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.657698 4749 scope.go:117] "RemoveContainer" containerID="dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c" Nov 29 02:34:31 crc kubenswrapper[4749]: E1129 02:34:31.658502 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c\": container with ID starting with dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c not found: ID does not exist" containerID="dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.658572 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c"} err="failed to get container status \"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c\": rpc error: code = NotFound desc = could not find container \"dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c\": container with ID starting with dadc4f1f5931671904fa717f23102a3435b96b0456d257d43eb60e4811a7166c not found: ID does not exist" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.658614 4749 scope.go:117] "RemoveContainer" containerID="db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80" Nov 29 02:34:31 crc kubenswrapper[4749]: E1129 02:34:31.659539 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80\": container with ID starting with db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80 not found: ID does not exist" containerID="db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.659707 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80"} err="failed to get container status \"db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80\": rpc error: code = NotFound desc = could not find container \"db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80\": container with ID starting with db00d25e16396fcb269d845fd9a7cfa212758afafc21a6780cd76ae105c19b80 not found: ID does not exist" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.659864 4749 scope.go:117] "RemoveContainer" containerID="55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217" Nov 29 02:34:31 crc kubenswrapper[4749]: E1129 02:34:31.660519 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217\": container with ID starting with 55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217 not found: ID does not exist" containerID="55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217" Nov 29 02:34:31 crc kubenswrapper[4749]: I1129 02:34:31.660564 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217"} err="failed to get container status \"55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217\": rpc error: code = NotFound desc = could not find container \"55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217\": container with ID starting with 55f34a338d3b967e5af333f177ae68183973d863c22e307b982fc8635b5a6217 not found: ID does not exist" Nov 29 02:34:33 crc kubenswrapper[4749]: I1129 02:34:33.087024 4749 kubelet_volumes.go:163] "Cleaned 
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.136651 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"]
Nov 29 02:34:36 crc kubenswrapper[4749]: E1129 02:34:36.138255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b20f23-3180-413a-93a5-7cb32cb20289" containerName="mariadb-client-2"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138289 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b20f23-3180-413a-93a5-7cb32cb20289" containerName="mariadb-client-2"
Nov 29 02:34:36 crc kubenswrapper[4749]: E1129 02:34:36.138331 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="extract-content"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138348 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="extract-content"
Nov 29 02:34:36 crc kubenswrapper[4749]: E1129 02:34:36.138376 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="extract-utilities"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138392 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="extract-utilities"
Nov 29 02:34:36 crc kubenswrapper[4749]: E1129 02:34:36.138416 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="registry-server"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138431 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="registry-server"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138779 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e608ac1-362b-4e26-b3b9-18f7eb3187ab" containerName="registry-server"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.138815 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="83b20f23-3180-413a-93a5-7cb32cb20289" containerName="mariadb-client-2"
Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.141608 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w2bn"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.169831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"] Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.272654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdr4\" (UniqueName: \"kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.272951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.273104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.374963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdr4\" (UniqueName: \"kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.375124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.375167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.375925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.375991 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.398153 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5hdr4\" (UniqueName: \"kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4\") pod \"redhat-marketplace-2w2bn\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.519376 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:36 crc kubenswrapper[4749]: I1129 02:34:36.969533 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"] Nov 29 02:34:37 crc kubenswrapper[4749]: I1129 02:34:37.588975 4749 generic.go:334] "Generic (PLEG): container finished" podID="73a24012-232e-4241-81be-0e1e8b5f988c" containerID="a07fbbea936c8ac0bfbc4c2a52c57cd1097d000a7809132d7e3ad769210ceeeb" exitCode=0 Nov 29 02:34:37 crc kubenswrapper[4749]: I1129 02:34:37.589034 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerDied","Data":"a07fbbea936c8ac0bfbc4c2a52c57cd1097d000a7809132d7e3ad769210ceeeb"} Nov 29 02:34:37 crc kubenswrapper[4749]: I1129 02:34:37.589078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerStarted","Data":"758870bef032010d679340e06aea4b45b27c6aa947973012060ee4809c724afa"} Nov 29 02:34:38 crc kubenswrapper[4749]: I1129 02:34:38.075811 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:34:38 crc kubenswrapper[4749]: E1129 02:34:38.076451 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:34:38 crc kubenswrapper[4749]: I1129 02:34:38.600265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerStarted","Data":"25b42774a36142b45ee9b87845c309ca46965dec19afb97eaa1314ae4dcc0e05"} Nov 29 02:34:39 crc kubenswrapper[4749]: I1129 02:34:39.615927 4749 generic.go:334] "Generic (PLEG): container finished" podID="73a24012-232e-4241-81be-0e1e8b5f988c" containerID="25b42774a36142b45ee9b87845c309ca46965dec19afb97eaa1314ae4dcc0e05" exitCode=0 Nov 29 02:34:39 crc kubenswrapper[4749]: I1129 02:34:39.615990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerDied","Data":"25b42774a36142b45ee9b87845c309ca46965dec19afb97eaa1314ae4dcc0e05"} Nov 29 02:34:40 crc kubenswrapper[4749]: I1129 02:34:40.650895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerStarted","Data":"30cc7e5991e8416317472fcbba4e5d0105b1cb689a37210545d32629f4b81e25"} Nov 29 02:34:40 crc kubenswrapper[4749]: I1129 02:34:40.679881 4749 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-marketplace-2w2bn" podStartSLOduration=2.10854484 podStartE2EDuration="4.679861578s" podCreationTimestamp="2025-11-29 02:34:36 +0000 UTC" firstStartedPulling="2025-11-29 02:34:37.591394995 +0000 UTC m=+5020.763544882" lastFinishedPulling="2025-11-29 02:34:40.162711733 +0000 UTC m=+5023.334861620" observedRunningTime="2025-11-29 02:34:40.677554992 +0000 UTC m=+5023.849704930" watchObservedRunningTime="2025-11-29 02:34:40.679861578 +0000 UTC m=+5023.852011445" Nov 29 02:34:46 crc kubenswrapper[4749]: I1129 02:34:46.520305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:46 crc kubenswrapper[4749]: I1129 02:34:46.520758 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:47 crc kubenswrapper[4749]: I1129 02:34:47.066171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:47 crc kubenswrapper[4749]: I1129 02:34:47.150654 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:47 crc kubenswrapper[4749]: I1129 02:34:47.322187 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"] Nov 29 02:34:48 crc kubenswrapper[4749]: I1129 02:34:48.769027 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2w2bn" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="registry-server" containerID="cri-o://30cc7e5991e8416317472fcbba4e5d0105b1cb689a37210545d32629f4b81e25" gracePeriod=2 Nov 29 02:34:49 crc kubenswrapper[4749]: I1129 02:34:49.075359 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:34:49 crc kubenswrapper[4749]: E1129 02:34:49.075748 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:34:49 crc kubenswrapper[4749]: I1129 02:34:49.783112 4749 generic.go:334] "Generic (PLEG): container finished" podID="73a24012-232e-4241-81be-0e1e8b5f988c" containerID="30cc7e5991e8416317472fcbba4e5d0105b1cb689a37210545d32629f4b81e25" exitCode=0 Nov 29 02:34:49 crc kubenswrapper[4749]: I1129 02:34:49.783174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerDied","Data":"30cc7e5991e8416317472fcbba4e5d0105b1cb689a37210545d32629f4b81e25"} Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.063709 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.158516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content\") pod \"73a24012-232e-4241-81be-0e1e8b5f988c\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.158629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities\") pod \"73a24012-232e-4241-81be-0e1e8b5f988c\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.159462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hdr4\" (UniqueName: \"kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4\") pod \"73a24012-232e-4241-81be-0e1e8b5f988c\" (UID: \"73a24012-232e-4241-81be-0e1e8b5f988c\") " Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.159789 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities" (OuterVolumeSpecName: "utilities") pod "73a24012-232e-4241-81be-0e1e8b5f988c" (UID: "73a24012-232e-4241-81be-0e1e8b5f988c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.160263 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.165769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4" (OuterVolumeSpecName: "kube-api-access-5hdr4") pod "73a24012-232e-4241-81be-0e1e8b5f988c" (UID: "73a24012-232e-4241-81be-0e1e8b5f988c"). InnerVolumeSpecName "kube-api-access-5hdr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.194588 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73a24012-232e-4241-81be-0e1e8b5f988c" (UID: "73a24012-232e-4241-81be-0e1e8b5f988c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.262307 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hdr4\" (UniqueName: \"kubernetes.io/projected/73a24012-232e-4241-81be-0e1e8b5f988c-kube-api-access-5hdr4\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.262346 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73a24012-232e-4241-81be-0e1e8b5f988c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.797878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2w2bn" event={"ID":"73a24012-232e-4241-81be-0e1e8b5f988c","Type":"ContainerDied","Data":"758870bef032010d679340e06aea4b45b27c6aa947973012060ee4809c724afa"} Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.798619 4749 scope.go:117] "RemoveContainer" containerID="30cc7e5991e8416317472fcbba4e5d0105b1cb689a37210545d32629f4b81e25" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.798503 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2w2bn" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.851843 4749 scope.go:117] "RemoveContainer" containerID="25b42774a36142b45ee9b87845c309ca46965dec19afb97eaa1314ae4dcc0e05" Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.860158 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"] Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.874842 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2w2bn"] Nov 29 02:34:50 crc kubenswrapper[4749]: I1129 02:34:50.884705 4749 scope.go:117] "RemoveContainer" containerID="a07fbbea936c8ac0bfbc4c2a52c57cd1097d000a7809132d7e3ad769210ceeeb" Nov 29 02:34:51 crc kubenswrapper[4749]: I1129 02:34:51.089612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" path="/var/lib/kubelet/pods/73a24012-232e-4241-81be-0e1e8b5f988c/volumes" Nov 29 02:35:03 crc kubenswrapper[4749]: I1129 02:35:03.075580 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:35:03 crc kubenswrapper[4749]: E1129 02:35:03.076756 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:35:15 crc kubenswrapper[4749]: I1129 02:35:15.075561 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:35:15 crc kubenswrapper[4749]: E1129 02:35:15.076487 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:35:27 crc kubenswrapper[4749]: I1129 02:35:27.078267 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:35:27 crc kubenswrapper[4749]: E1129 02:35:27.078944 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:35:40 crc kubenswrapper[4749]: I1129 02:35:40.075898 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:35:40 crc kubenswrapper[4749]: E1129 02:35:40.076933 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:35:51 crc kubenswrapper[4749]: I1129 02:35:51.075677 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:35:51 crc kubenswrapper[4749]: E1129 02:35:51.076616 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:36:02 crc kubenswrapper[4749]: I1129 02:36:02.075326 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:36:02 crc kubenswrapper[4749]: E1129 02:36:02.076352 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:36:15 crc kubenswrapper[4749]: I1129 02:36:15.075485 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:36:15 crc kubenswrapper[4749]: E1129 02:36:15.076752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:36:27 crc kubenswrapper[4749]: I1129 02:36:27.083733 4749 scope.go:117] "RemoveContainer" 
containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:36:27 crc kubenswrapper[4749]: E1129 02:36:27.084392 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:36:41 crc kubenswrapper[4749]: I1129 02:36:41.075124 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:36:41 crc kubenswrapper[4749]: E1129 02:36:41.076305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:36:55 crc kubenswrapper[4749]: I1129 02:36:55.075521 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:36:55 crc kubenswrapper[4749]: E1129 02:36:55.076444 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:37:09 crc kubenswrapper[4749]: I1129 02:37:09.076176 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:37:09 crc kubenswrapper[4749]: E1129 02:37:09.077323 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:37:20 crc kubenswrapper[4749]: I1129 02:37:20.075659 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:37:20 crc kubenswrapper[4749]: E1129 02:37:20.077189 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:37:35 crc kubenswrapper[4749]: I1129 02:37:35.076550 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:37:35 crc kubenswrapper[4749]: E1129 02:37:35.079255 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:37:46 crc kubenswrapper[4749]: I1129 02:37:46.075073 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:37:46 crc kubenswrapper[4749]: E1129 02:37:46.075980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:37:57 crc kubenswrapper[4749]: I1129 02:37:57.082310 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:37:57 crc kubenswrapper[4749]: E1129 02:37:57.083665 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.640775 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:03 crc kubenswrapper[4749]: E1129 02:38:03.641716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="extract-content" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.641733 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="extract-content" Nov 29 02:38:03 crc kubenswrapper[4749]: E1129 02:38:03.641755 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="extract-utilities" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.641763 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="extract-utilities" Nov 29 02:38:03 crc kubenswrapper[4749]: E1129 02:38:03.641775 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="registry-server" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.641783 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="registry-server" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.641993 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a24012-232e-4241-81be-0e1e8b5f988c" containerName="registry-server" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.646329 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.667376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hns78\" (UniqueName: \"kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.667584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.667746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.678819 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.768716 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.768790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.768872 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hns78\" (UniqueName: \"kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.769598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.769655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.791321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hns78\" (UniqueName: \"kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78\") pod \"redhat-operators-4mkkp\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:03 crc kubenswrapper[4749]: I1129 02:38:03.983629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:04 crc kubenswrapper[4749]: I1129 02:38:04.215359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:04 crc kubenswrapper[4749]: I1129 02:38:04.758496 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerID="ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd" exitCode=0 Nov 29 02:38:04 crc kubenswrapper[4749]: I1129 02:38:04.758544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerDied","Data":"ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd"} Nov 29 02:38:04 crc kubenswrapper[4749]: I1129 02:38:04.758717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerStarted","Data":"b56e488a1c230b0cb906d39b915b3b8011b0402b1fce863f827384ac8c0a7cd6"} Nov 29 02:38:04 crc kubenswrapper[4749]: I1129 02:38:04.760308 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:38:05 crc kubenswrapper[4749]: I1129 02:38:05.772533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerStarted","Data":"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a"} Nov 29 02:38:06 crc kubenswrapper[4749]: I1129 02:38:06.787913 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerID="144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a" exitCode=0 Nov 29 02:38:06 crc kubenswrapper[4749]: I1129 02:38:06.788025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerDied","Data":"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a"} Nov 29 02:38:07 crc kubenswrapper[4749]: I1129 02:38:07.803150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerStarted","Data":"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253"} Nov 29 02:38:07 crc kubenswrapper[4749]: I1129 02:38:07.834247 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4mkkp" podStartSLOduration=2.057166486 podStartE2EDuration="4.834222703s" podCreationTimestamp="2025-11-29 02:38:03 +0000 UTC" firstStartedPulling="2025-11-29 02:38:04.760045195 +0000 UTC m=+5227.932195052" lastFinishedPulling="2025-11-29 02:38:07.537101382 +0000 UTC m=+5230.709251269" observedRunningTime="2025-11-29 02:38:07.831327143 +0000 UTC m=+5231.003477040" watchObservedRunningTime="2025-11-29 02:38:07.834222703 +0000 UTC m=+5231.006372600" Nov 29 02:38:12 crc 
kubenswrapper[4749]: I1129 02:38:12.077351 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:38:12 crc kubenswrapper[4749]: E1129 02:38:12.079692 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:38:13 crc kubenswrapper[4749]: I1129 02:38:13.984813 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:13 crc kubenswrapper[4749]: I1129 02:38:13.984875 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:15 crc kubenswrapper[4749]: I1129 02:38:15.051648 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4mkkp" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="registry-server" probeResult="failure" output=< Nov 29 02:38:15 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 02:38:15 crc kubenswrapper[4749]: > Nov 29 02:38:24 crc kubenswrapper[4749]: I1129 02:38:24.070017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:24 crc kubenswrapper[4749]: I1129 02:38:24.161455 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:24 crc kubenswrapper[4749]: I1129 02:38:24.319835 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:25 crc kubenswrapper[4749]: I1129 02:38:25.978312 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4mkkp" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="registry-server" containerID="cri-o://9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253" gracePeriod=2 Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.079792 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:38:27 crc kubenswrapper[4749]: E1129 02:38:27.081258 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.487866 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.501800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hns78\" (UniqueName: \"kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78\") pod \"9d831e1b-da21-43da-82b5-a513c1d1f261\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.501973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content\") pod \"9d831e1b-da21-43da-82b5-a513c1d1f261\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.502059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities\") pod \"9d831e1b-da21-43da-82b5-a513c1d1f261\" (UID: \"9d831e1b-da21-43da-82b5-a513c1d1f261\") " Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.502834 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities" (OuterVolumeSpecName: "utilities") pod "9d831e1b-da21-43da-82b5-a513c1d1f261" (UID: "9d831e1b-da21-43da-82b5-a513c1d1f261"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.518115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78" (OuterVolumeSpecName: "kube-api-access-hns78") pod "9d831e1b-da21-43da-82b5-a513c1d1f261" (UID: "9d831e1b-da21-43da-82b5-a513c1d1f261"). InnerVolumeSpecName "kube-api-access-hns78". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.603500 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hns78\" (UniqueName: \"kubernetes.io/projected/9d831e1b-da21-43da-82b5-a513c1d1f261-kube-api-access-hns78\") on node \"crc\" DevicePath \"\"" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.603535 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.640703 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d831e1b-da21-43da-82b5-a513c1d1f261" (UID: "9d831e1b-da21-43da-82b5-a513c1d1f261"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.706067 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d831e1b-da21-43da-82b5-a513c1d1f261-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.993828 4749 generic.go:334] "Generic (PLEG): container finished" podID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerID="9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253" exitCode=0 Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.993879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerDied","Data":"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253"} Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.993915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4mkkp" event={"ID":"9d831e1b-da21-43da-82b5-a513c1d1f261","Type":"ContainerDied","Data":"b56e488a1c230b0cb906d39b915b3b8011b0402b1fce863f827384ac8c0a7cd6"} Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.993944 4749 scope.go:117] "RemoveContainer" containerID="9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253" Nov 29 02:38:27 crc kubenswrapper[4749]: I1129 02:38:27.994134 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4mkkp" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.012906 4749 scope.go:117] "RemoveContainer" containerID="144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.039269 4749 scope.go:117] "RemoveContainer" containerID="ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.039479 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.050143 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4mkkp"] Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.087732 4749 scope.go:117] "RemoveContainer" containerID="9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253" Nov 29 02:38:28 crc kubenswrapper[4749]: E1129 02:38:28.089472 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253\": container with ID starting with 9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253 not found: ID does not exist" containerID="9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.089513 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253"} err="failed to get container status \"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253\": rpc error: code = NotFound desc = could not find container \"9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253\": container with ID starting with 9bcd46b11818af731da57d88b5d055c4075f3b226b6c3610685c507cd1dfc253 not found: ID does not exist" Nov 29 02:38:28 crc 
kubenswrapper[4749]: I1129 02:38:28.089538 4749 scope.go:117] "RemoveContainer" containerID="144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a" Nov 29 02:38:28 crc kubenswrapper[4749]: E1129 02:38:28.089954 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a\": container with ID starting with 144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a not found: ID does not exist" containerID="144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.090007 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a"} err="failed to get container status \"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a\": rpc error: code = NotFound desc = could not find container \"144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a\": container with ID starting with 144be272ad7acf01cbb8b64a02311bf0cbf138da2c7beb31bb73feea710ba72a not found: ID does not exist" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.090042 4749 scope.go:117] "RemoveContainer" containerID="ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd" Nov 29 02:38:28 crc kubenswrapper[4749]: E1129 02:38:28.090438 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd\": container with ID starting with ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd not found: ID does not exist" containerID="ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd" Nov 29 02:38:28 crc kubenswrapper[4749]: I1129 02:38:28.090464 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd"} err="failed to get container status \"ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd\": rpc error: code = NotFound desc = could not find container \"ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd\": container with ID starting with ac2e15f3b3692fe7dae3765b2d67b97745bdeabe936446bef5adc26f5c4062bd not found: ID does not exist" Nov 29 02:38:29 crc kubenswrapper[4749]: I1129 02:38:29.091702 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" path="/var/lib/kubelet/pods/9d831e1b-da21-43da-82b5-a513c1d1f261/volumes" Nov 29 02:38:38 crc kubenswrapper[4749]: I1129 02:38:38.074715 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:38:38 crc kubenswrapper[4749]: E1129 02:38:38.075483 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:38:53 crc kubenswrapper[4749]: I1129 02:38:53.075687 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" 
Nov 29 02:38:53 crc kubenswrapper[4749]: E1129 02:38:53.076651 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:39:06 crc kubenswrapper[4749]: I1129 02:39:06.074887 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:39:06 crc kubenswrapper[4749]: E1129 02:39:06.075793 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:39:18 crc kubenswrapper[4749]: I1129 02:39:18.075874 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:39:18 crc kubenswrapper[4749]: E1129 02:39:18.076889 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.170091 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Nov 29 02:39:22 crc kubenswrapper[4749]: E1129 02:39:22.171760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="extract-utilities" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.171792 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="extract-utilities" Nov 29 02:39:22 crc kubenswrapper[4749]: E1129 02:39:22.171827 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="registry-server" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.171845 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="registry-server" Nov 29 02:39:22 crc kubenswrapper[4749]: E1129 02:39:22.171911 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="extract-content" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.171926 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="extract-content" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.175591 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d831e1b-da21-43da-82b5-a513c1d1f261" containerName="registry-server" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.176766 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.182895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.185887 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-j85bz" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.330350 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.330445 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xllz7\" (UniqueName: \"kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.432448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.432530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xllz7\" (UniqueName: \"kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data" Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.436727 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.436818 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33c6349f24018bd3318b56991370f1d862e3b643415f6cfdd86c9fdf1f416ba9/globalmount\"" pod="openstack/mariadb-copy-data"
Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.468879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xllz7\" (UniqueName: \"kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data"
Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.475606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") pod \"mariadb-copy-data\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " pod="openstack/mariadb-copy-data"
Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.504699 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data"
Nov 29 02:39:22 crc kubenswrapper[4749]: I1129 02:39:22.848605 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"]
Nov 29 02:39:23 crc kubenswrapper[4749]: I1129 02:39:23.608295 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0dea6975-e5ed-4016-b418-88a79f113bd4","Type":"ContainerStarted","Data":"71226f11aa10a347fc1f88c1de493850783c83ebc6f7eacd0784c6b4dc6a1aa8"}
Nov 29 02:39:23 crc kubenswrapper[4749]: I1129 02:39:23.608749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0dea6975-e5ed-4016-b418-88a79f113bd4","Type":"ContainerStarted","Data":"e1c43d34fb020911a8f49fe71f11e289341cc546519bdfb9679d2f13ca84ee01"}
Nov 29 02:39:23 crc kubenswrapper[4749]: I1129 02:39:23.627458 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.627432187 podStartE2EDuration="2.627432187s" podCreationTimestamp="2025-11-29 02:39:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:39:23.622098228 +0000 UTC m=+5306.794248095" watchObservedRunningTime="2025-11-29 02:39:23.627432187 +0000 UTC m=+5306.799582054"
Nov 29 02:39:26 crc kubenswrapper[4749]: I1129 02:39:26.764085 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:26 crc kubenswrapper[4749]: I1129 02:39:26.767050 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:26 crc kubenswrapper[4749]: I1129 02:39:26.780716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:26 crc kubenswrapper[4749]: I1129 02:39:26.957240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865lr\" (UniqueName: \"kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr\") pod \"mariadb-client\" (UID: \"ee72a323-297b-48ba-8d88-12dfed360e39\") " pod="openstack/mariadb-client"
Nov 29 02:39:27 crc kubenswrapper[4749]: I1129 02:39:27.059903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-865lr\" (UniqueName: \"kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr\") pod \"mariadb-client\" (UID: \"ee72a323-297b-48ba-8d88-12dfed360e39\") " pod="openstack/mariadb-client"
Nov 29 02:39:27 crc kubenswrapper[4749]: I1129 02:39:27.113764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-865lr\" (UniqueName: \"kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr\") pod \"mariadb-client\" (UID: \"ee72a323-297b-48ba-8d88-12dfed360e39\") " pod="openstack/mariadb-client"
Nov 29 02:39:27 crc kubenswrapper[4749]: I1129 02:39:27.400779 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:27 crc kubenswrapper[4749]: I1129 02:39:27.956552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:28 crc kubenswrapper[4749]: I1129 02:39:28.657985 4749 generic.go:334] "Generic (PLEG): container finished" podID="ee72a323-297b-48ba-8d88-12dfed360e39" containerID="24b1bd9958e0b0112f79662b2ec83e0e7b4d598f8f264e5f318f3894619ddbed" exitCode=0
Nov 29 02:39:28 crc kubenswrapper[4749]: I1129 02:39:28.658036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee72a323-297b-48ba-8d88-12dfed360e39","Type":"ContainerDied","Data":"24b1bd9958e0b0112f79662b2ec83e0e7b4d598f8f264e5f318f3894619ddbed"}
Nov 29 02:39:28 crc kubenswrapper[4749]: I1129 02:39:28.658076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ee72a323-297b-48ba-8d88-12dfed360e39","Type":"ContainerStarted","Data":"7b0d3b818a4b518c7b21a8d8c8ee815b9bad8755333ed5bc4a0776b3ce47a38d"}
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.075335 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.092936 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.131678 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ee72a323-297b-48ba-8d88-12dfed360e39/mariadb-client/0.log"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.161993 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.173702 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.222253 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-865lr\" (UniqueName: \"kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr\") pod \"ee72a323-297b-48ba-8d88-12dfed360e39\" (UID: \"ee72a323-297b-48ba-8d88-12dfed360e39\") "
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.230555 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr" (OuterVolumeSpecName: "kube-api-access-865lr") pod "ee72a323-297b-48ba-8d88-12dfed360e39" (UID: "ee72a323-297b-48ba-8d88-12dfed360e39"). InnerVolumeSpecName "kube-api-access-865lr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.315166 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:30 crc kubenswrapper[4749]: E1129 02:39:30.315570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee72a323-297b-48ba-8d88-12dfed360e39" containerName="mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.315591 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee72a323-297b-48ba-8d88-12dfed360e39" containerName="mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.315802 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee72a323-297b-48ba-8d88-12dfed360e39" containerName="mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.316499 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.326125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldnc\" (UniqueName: \"kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc\") pod \"mariadb-client\" (UID: \"0281c89f-2372-4870-a380-23c223bbd12e\") " pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.327622 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-865lr\" (UniqueName: \"kubernetes.io/projected/ee72a323-297b-48ba-8d88-12dfed360e39-kube-api-access-865lr\") on node \"crc\" DevicePath \"\""
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.345260 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.429597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldnc\" (UniqueName: \"kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc\") pod \"mariadb-client\" (UID: \"0281c89f-2372-4870-a380-23c223bbd12e\") " pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.454435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldnc\" (UniqueName: \"kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc\") pod \"mariadb-client\" (UID: \"0281c89f-2372-4870-a380-23c223bbd12e\") " pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.648466 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.682666 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b0d3b818a4b518c7b21a8d8c8ee815b9bad8755333ed5bc4a0776b3ce47a38d"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.682693 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.685500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323"}
Nov 29 02:39:30 crc kubenswrapper[4749]: I1129 02:39:30.733074 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ee72a323-297b-48ba-8d88-12dfed360e39" podUID="0281c89f-2372-4870-a380-23c223bbd12e"
Nov 29 02:39:31 crc kubenswrapper[4749]: I1129 02:39:31.088639 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee72a323-297b-48ba-8d88-12dfed360e39" path="/var/lib/kubelet/pods/ee72a323-297b-48ba-8d88-12dfed360e39/volumes"
Nov 29 02:39:31 crc kubenswrapper[4749]: I1129 02:39:31.191161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:31 crc kubenswrapper[4749]: W1129 02:39:31.197351 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0281c89f_2372_4870_a380_23c223bbd12e.slice/crio-cc8e06c013d69b36cef6b4ddb7c55cd2af1c878020ec98ad3ca560cd7f0899f2 WatchSource:0}: Error finding container cc8e06c013d69b36cef6b4ddb7c55cd2af1c878020ec98ad3ca560cd7f0899f2: Status 404 returned error can't find the container with id cc8e06c013d69b36cef6b4ddb7c55cd2af1c878020ec98ad3ca560cd7f0899f2
Nov 29 02:39:31 crc kubenswrapper[4749]: I1129 02:39:31.706626 4749 generic.go:334] "Generic (PLEG): container finished" podID="0281c89f-2372-4870-a380-23c223bbd12e" containerID="95f8e8edb0d102394907d307dd76ecf034839055ea28647f3ab96c6ca6383bdf" exitCode=0
Nov 29 02:39:31 crc kubenswrapper[4749]: I1129 02:39:31.706918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0281c89f-2372-4870-a380-23c223bbd12e","Type":"ContainerDied","Data":"95f8e8edb0d102394907d307dd76ecf034839055ea28647f3ab96c6ca6383bdf"}
Nov 29 02:39:31 crc kubenswrapper[4749]: I1129 02:39:31.706944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0281c89f-2372-4870-a380-23c223bbd12e","Type":"ContainerStarted","Data":"cc8e06c013d69b36cef6b4ddb7c55cd2af1c878020ec98ad3ca560cd7f0899f2"}
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.108373 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.133576 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_0281c89f-2372-4870-a380-23c223bbd12e/mariadb-client/0.log"
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.171446 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.179049 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.274720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldnc\" (UniqueName: \"kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc\") pod \"0281c89f-2372-4870-a380-23c223bbd12e\" (UID: \"0281c89f-2372-4870-a380-23c223bbd12e\") "
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.285430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc" (OuterVolumeSpecName: "kube-api-access-qldnc") pod "0281c89f-2372-4870-a380-23c223bbd12e" (UID: "0281c89f-2372-4870-a380-23c223bbd12e"). InnerVolumeSpecName "kube-api-access-qldnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.377187 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldnc\" (UniqueName: \"kubernetes.io/projected/0281c89f-2372-4870-a380-23c223bbd12e-kube-api-access-qldnc\") on node \"crc\" DevicePath \"\""
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.731320 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc8e06c013d69b36cef6b4ddb7c55cd2af1c878020ec98ad3ca560cd7f0899f2"
Nov 29 02:39:33 crc kubenswrapper[4749]: I1129 02:39:33.731419 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Nov 29 02:39:35 crc kubenswrapper[4749]: I1129 02:39:35.092342 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0281c89f-2372-4870-a380-23c223bbd12e" path="/var/lib/kubelet/pods/0281c89f-2372-4870-a380-23c223bbd12e/volumes"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.851303 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 29 02:40:03 crc kubenswrapper[4749]: E1129 02:40:03.852488 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0281c89f-2372-4870-a380-23c223bbd12e" containerName="mariadb-client"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.852516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0281c89f-2372-4870-a380-23c223bbd12e" containerName="mariadb-client"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.852943 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0281c89f-2372-4870-a380-23c223bbd12e" containerName="mariadb-client"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.854670 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.857542 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.858192 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.859192 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-jqz7c"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.873679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.890004 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.892435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.894268 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.896225 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.916643 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Nov 29 02:40:03 crc kubenswrapper[4749]: I1129 02:40:03.936994 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.052709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-config\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.052804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d50cec-a30f-49a1-857b-3181d5d1e632-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.052860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5702b1-caee-424f-b2ba-e62faf326574-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.052936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmlq\" (UniqueName: \"kubernetes.io/projected/a4d50cec-a30f-49a1-857b-3181d5d1e632-kube-api-access-9vmlq\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.052995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxtc\" (UniqueName: \"kubernetes.io/projected/a12be827-924f-4ff4-8fba-c2e78d1222d0-kube-api-access-llxtc\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12be827-924f-4ff4-8fba-c2e78d1222d0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-config\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4d50cec-a30f-49a1-857b-3181d5d1e632-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b9685de5-b542-49c7-9351-b3df93d99600\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b9685de5-b542-49c7-9351-b3df93d99600\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053502 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a12be827-924f-4ff4-8fba-c2e78d1222d0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053704 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scl24\" (UniqueName: \"kubernetes.io/projected/5a5702b1-caee-424f-b2ba-e62faf326574-kube-api-access-scl24\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.053995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-config\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.054115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5702b1-caee-424f-b2ba-e62faf326574-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.054266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.075345 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.078380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.086637 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.087141 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.088115 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xsg55"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.099086 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.108368 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.109698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.121439 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.123178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.136280 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.155639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-config\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5702b1-caee-424f-b2ba-e62faf326574-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157087 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d50cec-a30f-49a1-857b-3181d5d1e632-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157153 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmlq\" (UniqueName: \"kubernetes.io/projected/a4d50cec-a30f-49a1-857b-3181d5d1e632-kube-api-access-9vmlq\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llxtc\" (UniqueName: \"kubernetes.io/projected/a12be827-924f-4ff4-8fba-c2e78d1222d0-kube-api-access-llxtc\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157216 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12be827-924f-4ff4-8fba-c2e78d1222d0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-config\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157262 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4d50cec-a30f-49a1-857b-3181d5d1e632-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b9685de5-b542-49c7-9351-b3df93d99600\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b9685de5-b542-49c7-9351-b3df93d99600\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a12be827-924f-4ff4-8fba-c2e78d1222d0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.157990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.158184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scl24\" (UniqueName: \"kubernetes.io/projected/5a5702b1-caee-424f-b2ba-e62faf326574-kube-api-access-scl24\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.158325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-config\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.158491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5702b1-caee-424f-b2ba-e62faf326574-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.159876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-config\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.160618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5702b1-caee-424f-b2ba-e62faf326574-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.160973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-config\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.161443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-config\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.161582 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a12be827-924f-4ff4-8fba-c2e78d1222d0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.161693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4d50cec-a30f-49a1-857b-3181d5d1e632-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.161824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5702b1-caee-424f-b2ba-e62faf326574-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.162653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4d50cec-a30f-49a1-857b-3181d5d1e632-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.162707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a12be827-924f-4ff4-8fba-c2e78d1222d0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167287 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167321 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de270adb41c05568bbd7cb92f8d0e13662fe8821ffb53d4b14540533acc54d22/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167331 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167412 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b9685de5-b542-49c7-9351-b3df93d99600\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b9685de5-b542-49c7-9351-b3df93d99600\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3eb5a1b0bac7284812e9324d8e9c1c4acf9ee55efc575d73fb4e581d79e4cccc/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167929 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.167982 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/aa30ecf9072ead5f5d10174b7af9a8cac3d64bf5a92bfac70dcdf4b4a06670bb/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.172071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d50cec-a30f-49a1-857b-3181d5d1e632-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.176512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12be827-924f-4ff4-8fba-c2e78d1222d0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.179673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5702b1-caee-424f-b2ba-e62faf326574-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.179751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llxtc\" (UniqueName: \"kubernetes.io/projected/a12be827-924f-4ff4-8fba-c2e78d1222d0-kube-api-access-llxtc\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.194900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scl24\" (UniqueName: \"kubernetes.io/projected/5a5702b1-caee-424f-b2ba-e62faf326574-kube-api-access-scl24\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.201962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmlq\" (UniqueName: \"kubernetes.io/projected/a4d50cec-a30f-49a1-857b-3181d5d1e632-kube-api-access-9vmlq\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.216492 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a3caaa9a-39ea-4c72-8395-7e90e584d4ea\") pod \"ovsdbserver-sb-2\" (UID: \"5a5702b1-caee-424f-b2ba-e62faf326574\") " pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.226673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b9685de5-b542-49c7-9351-b3df93d99600\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b9685de5-b542-49c7-9351-b3df93d99600\") pod \"ovsdbserver-sb-0\" (UID: \"a12be827-924f-4ff4-8fba-c2e78d1222d0\") " pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.227109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b6e74288-b2a6-42f2-8707-b040ff5f6b78\") pod \"ovsdbserver-sb-1\" (UID: \"a4d50cec-a30f-49a1-857b-3181d5d1e632\") " pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.253042 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.259890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pjks\" (UniqueName: \"kubernetes.io/projected/30163aa0-30e4-4c0e-a703-47bb8a18bf07-kube-api-access-6pjks\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.259955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a416ad5-b3da-4bd9-949f-23485a7d2647-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30163aa0-30e4-4c0e-a703-47bb8a18bf07-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6665f4d3-5e86-4e4e-af41-9574adad9b2d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-config\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6665f4d3-5e86-4e4e-af41-9574adad9b2d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260379 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a416ad5-b3da-4bd9-949f-23485a7d2647-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30163aa0-30e4-4c0e-a703-47bb8a18bf07-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260911 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8687\" (UniqueName: \"kubernetes.io/projected/6665f4d3-5e86-4e4e-af41-9574adad9b2d-kube-api-access-b8687\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.260985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64p6q\" (UniqueName: \"kubernetes.io/projected/3a416ad5-b3da-4bd9-949f-23485a7d2647-kube-api-access-64p6q\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.269346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64p6q\" (UniqueName: \"kubernetes.io/projected/3a416ad5-b3da-4bd9-949f-23485a7d2647-kube-api-access-64p6q\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365327 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pjks\" (UniqueName: \"kubernetes.io/projected/30163aa0-30e4-4c0e-a703-47bb8a18bf07-kube-api-access-6pjks\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a416ad5-b3da-4bd9-949f-23485a7d2647-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30163aa0-30e4-4c0e-a703-47bb8a18bf07-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6665f4d3-5e86-4e4e-af41-9574adad9b2d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-config\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6665f4d3-5e86-4e4e-af41-9574adad9b2d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a416ad5-b3da-4bd9-949f-23485a7d2647-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30163aa0-30e4-4c0e-a703-47bb8a18bf07-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.365979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.366024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.366066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8687\" (UniqueName: \"kubernetes.io/projected/6665f4d3-5e86-4e4e-af41-9574adad9b2d-kube-api-access-b8687\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.366590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6665f4d3-5e86-4e4e-af41-9574adad9b2d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.366957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-config\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.367845 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-config\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.368315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30163aa0-30e4-4c0e-a703-47bb8a18bf07-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.368529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a416ad5-b3da-4bd9-949f-23485a7d2647-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.369087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a416ad5-b3da-4bd9-949f-23485a7d2647-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.369296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-config\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.369997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30163aa0-30e4-4c0e-a703-47bb8a18bf07-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.370338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6665f4d3-5e86-4e4e-af41-9574adad9b2d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.373452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30163aa0-30e4-4c0e-a703-47bb8a18bf07-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.373859 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.373871 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.373892 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3f579cc5748514e6566fdf61977129c868265904fe84b6dbd7338760c41cb5aa/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.373898 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/61f743c243a2f3c874e32371bc0d645b7979f6933cd84645598bdd90c44dac60/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.375646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a416ad5-b3da-4bd9-949f-23485a7d2647-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.376524 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6665f4d3-5e86-4e4e-af41-9574adad9b2d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.378673 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.378706 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/85e2df9281d2e6ef71911faf9e3dcdb86103d7a008c1faec81f461b9b4dd175e/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.388110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64p6q\" (UniqueName: \"kubernetes.io/projected/3a416ad5-b3da-4bd9-949f-23485a7d2647-kube-api-access-64p6q\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.388580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pjks\" (UniqueName: \"kubernetes.io/projected/30163aa0-30e4-4c0e-a703-47bb8a18bf07-kube-api-access-6pjks\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.398400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8687\" (UniqueName: \"kubernetes.io/projected/6665f4d3-5e86-4e4e-af41-9574adad9b2d-kube-api-access-b8687\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.416616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-00cfcbaa-f21a-4033-a664-b79f20037aa9\") pod \"ovsdbserver-nb-2\" (UID: \"30163aa0-30e4-4c0e-a703-47bb8a18bf07\") " pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.420911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9d8bec9-2ad9-49f1-98f4-f8d865e49e49\") pod \"ovsdbserver-nb-1\" (UID: \"3a416ad5-b3da-4bd9-949f-23485a7d2647\") " pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.431950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6d7e029-ccdc-4a78-8a00-a20bfbb50141\") pod \"ovsdbserver-nb-0\" (UID: \"6665f4d3-5e86-4e4e-af41-9574adad9b2d\") " pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.438809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.456347 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.519753 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.717849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.811648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Nov 29 02:40:04 crc kubenswrapper[4749]: I1129 02:40:04.917493 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.026719 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.068683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"30163aa0-30e4-4c0e-a703-47bb8a18bf07","Type":"ContainerStarted","Data":"039a514ed6c2ca55b7ddf5889e90a9db3d639a66791a144c0ffd0995b2f5c6c0"}
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.070121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a4d50cec-a30f-49a1-857b-3181d5d1e632","Type":"ContainerStarted","Data":"24eb6ea2e0fdb36205b99f39818d663758705e2366b9eeebbc7c7c5e4b3aae1d"}
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.071745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5a5702b1-caee-424f-b2ba-e62faf326574","Type":"ContainerStarted","Data":"e6c5c71a766da2f02eb5eb6f77bd2e1f43c81f8b42600cdf3685abf729e12ca3"}
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.071773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5a5702b1-caee-424f-b2ba-e62faf326574","Type":"ContainerStarted","Data":"b35282c4c01a64febfc5aeadf3ecd3c2bb20b7405c24cf41b0e42a8fdd613b37"}
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.113385 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 29 02:40:05 crc kubenswrapper[4749]: W1129 02:40:05.123363 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12be827_924f_4ff4_8fba_c2e78d1222d0.slice/crio-12f03f4c9d16523f023a3fb2900d1b9d158af5a0dcf8a8b2c938503fe9e25643 WatchSource:0}: Error finding container 12f03f4c9d16523f023a3fb2900d1b9d158af5a0dcf8a8b2c938503fe9e25643: Status 404 returned error can't find the container with id 12f03f4c9d16523f023a3fb2900d1b9d158af5a0dcf8a8b2c938503fe9e25643
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.257758 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Nov 29 02:40:05 crc kubenswrapper[4749]: I1129 02:40:05.711675 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Nov 29 02:40:05 crc kubenswrapper[4749]: W1129 02:40:05.720879 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a416ad5_b3da_4bd9_949f_23485a7d2647.slice/crio-9512906b1c787fdbe700f997c9822a6d617129d332e46d54f1c1ed280c3ff210 WatchSource:0}: Error finding container 9512906b1c787fdbe700f997c9822a6d617129d332e46d54f1c1ed280c3ff210: Status 404 returned error can't find the container with id 9512906b1c787fdbe700f997c9822a6d617129d332e46d54f1c1ed280c3ff210
Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.080660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6665f4d3-5e86-4e4e-af41-9574adad9b2d","Type":"ContainerStarted","Data":"de77d713b52dca94cf40882a8ec1031e0b4d1b2d68e8c85601c1b6e809187145"}
event={"ID":"6665f4d3-5e86-4e4e-af41-9574adad9b2d","Type":"ContainerStarted","Data":"de77d713b52dca94cf40882a8ec1031e0b4d1b2d68e8c85601c1b6e809187145"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.080709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6665f4d3-5e86-4e4e-af41-9574adad9b2d","Type":"ContainerStarted","Data":"ce662e3ea100f34c807eb2f32359601452d4c398061b22304edb2f6516767e0f"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.080721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6665f4d3-5e86-4e4e-af41-9574adad9b2d","Type":"ContainerStarted","Data":"cf6a5c6b2c0b17ebe55fe88b3cfe9d5c5919f8035f83789f38e7626a0d32d7c4"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.083112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a12be827-924f-4ff4-8fba-c2e78d1222d0","Type":"ContainerStarted","Data":"e809fcfd2273c80dbddbf4cb30159b55750a2559e0cd9e135267c1180dffa73f"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.083139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a12be827-924f-4ff4-8fba-c2e78d1222d0","Type":"ContainerStarted","Data":"ea81cfa1f2a86953c638745cd4ee4cba0695cb30df66aa0ff19497350a2e3f45"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.083151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"a12be827-924f-4ff4-8fba-c2e78d1222d0","Type":"ContainerStarted","Data":"12f03f4c9d16523f023a3fb2900d1b9d158af5a0dcf8a8b2c938503fe9e25643"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.084969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5a5702b1-caee-424f-b2ba-e62faf326574","Type":"ContainerStarted","Data":"dfd27bc1b53dc30a0dac85cd2f6fc4b9495bb6dcebdfc0808c77ba85b43b0be0"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.087959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a416ad5-b3da-4bd9-949f-23485a7d2647","Type":"ContainerStarted","Data":"420bf1fc21b67ae735f3bd6d2e074477698f0b765a447a0136b79af5ad82f2e3"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.088026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a416ad5-b3da-4bd9-949f-23485a7d2647","Type":"ContainerStarted","Data":"0152dfb10ca0a8b59d657cd8ae99d3266bfde8cecd8e51bee82910e7741d7175"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.088049 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3a416ad5-b3da-4bd9-949f-23485a7d2647","Type":"ContainerStarted","Data":"9512906b1c787fdbe700f997c9822a6d617129d332e46d54f1c1ed280c3ff210"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.089702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"30163aa0-30e4-4c0e-a703-47bb8a18bf07","Type":"ContainerStarted","Data":"da2757bdbe081910c55580833c7e375c6ca71cb316ac391a3f7e07133a1c8ab6"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.089746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"30163aa0-30e4-4c0e-a703-47bb8a18bf07","Type":"ContainerStarted","Data":"dc6ce33c8bfc71eb50ee5198ebb7c120b20562ec1c7aba729466245215ff5b03"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.091175 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a4d50cec-a30f-49a1-857b-3181d5d1e632","Type":"ContainerStarted","Data":"023287b77a4b9af27afb0ad104ab4569860bcebcdc8b7fd1948beeb261e7a36a"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.091257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"a4d50cec-a30f-49a1-857b-3181d5d1e632","Type":"ContainerStarted","Data":"1e272a9c2cf5fa36c4cf9f58485a87bcf021a8aaf3d4a4b8fe1680618554fcd2"} Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.101406 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.101384113 podStartE2EDuration="3.101384113s" podCreationTimestamp="2025-11-29 02:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.100694706 +0000 UTC m=+5349.272844573" watchObservedRunningTime="2025-11-29 02:40:06.101384113 +0000 UTC m=+5349.273533980" Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.125116 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.125100239 podStartE2EDuration="4.125100239s" podCreationTimestamp="2025-11-29 02:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.121480811 +0000 UTC m=+5349.293630678" watchObservedRunningTime="2025-11-29 02:40:06.125100239 +0000 UTC m=+5349.297250096" Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.174349 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.174333066 podStartE2EDuration="4.174333066s" podCreationTimestamp="2025-11-29 02:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.170191695 +0000 UTC m=+5349.342341622" watchObservedRunningTime="2025-11-29 02:40:06.174333066 +0000 UTC m=+5349.346482933" Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.177087 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.177079113 podStartE2EDuration="3.177079113s" podCreationTimestamp="2025-11-29 02:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.151244025 +0000 UTC m=+5349.323393952" watchObservedRunningTime="2025-11-29 02:40:06.177079113 +0000 UTC m=+5349.349228970" Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.201253 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.20123098 podStartE2EDuration="4.20123098s" podCreationTimestamp="2025-11-29 02:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.191267698 +0000 UTC m=+5349.363417565" watchObservedRunningTime="2025-11-29 02:40:06.20123098 +0000 UTC m=+5349.373380877" Nov 29 02:40:06 crc kubenswrapper[4749]: I1129 02:40:06.241755 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.241729944 
podStartE2EDuration="3.241729944s" podCreationTimestamp="2025-11-29 02:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:06.212545655 +0000 UTC m=+5349.384695512" watchObservedRunningTime="2025-11-29 02:40:06.241729944 +0000 UTC m=+5349.413879811" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.253682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.270487 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.439599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.456810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.520479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 29 02:40:07 crc kubenswrapper[4749]: I1129 02:40:07.718306 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.253497 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.269823 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.440110 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.457015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.519886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 29 02:40:09 crc kubenswrapper[4749]: I1129 02:40:09.718407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.355850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.358742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.416328 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.421036 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.506251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.534595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.589863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-nb-2" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.596439 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.612667 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.631500 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b59ccf65-jcttt"] Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.632908 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.635077 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.656279 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b59ccf65-jcttt"] Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.670118 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.756980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.790492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z9d7\" (UniqueName: \"kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.790598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.790665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.790683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.820969 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.834177 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b59ccf65-jcttt"] Nov 29 02:40:10 crc kubenswrapper[4749]: E1129 02:40:10.834680 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-5z9d7 ovsdbserver-sb], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" podUID="2a2ed237-e46a-4e93-8f92-780df31a2300" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.867977 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"] Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.869397 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.871636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.892113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z9d7\" (UniqueName: \"kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.892231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.892277 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.892293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.893547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.894251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.895013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.898471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"] Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.912866 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z9d7\" (UniqueName: \"kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7\") pod \"dnsmasq-dns-57b59ccf65-jcttt\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") " pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.993185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.993288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.993311 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.993383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:10 crc kubenswrapper[4749]: I1129 02:40:10.993405 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8kpw\" (UniqueName: \"kubernetes.io/projected/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-kube-api-access-r8kpw\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.095693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.095782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.095980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.096041 
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.096133 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.096930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.097151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.097471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.097721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.116665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8kpw\" (UniqueName: \"kubernetes.io/projected/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-kube-api-access-r8kpw\") pod \"dnsmasq-dns-8557458f49-z25rs\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") " pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.140144 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b59ccf65-jcttt"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.176648 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b59ccf65-jcttt"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.184942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.299237 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb\") pod \"2a2ed237-e46a-4e93-8f92-780df31a2300\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") "
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.299657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a2ed237-e46a-4e93-8f92-780df31a2300" (UID: "2a2ed237-e46a-4e93-8f92-780df31a2300"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.299715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config\") pod \"2a2ed237-e46a-4e93-8f92-780df31a2300\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") "
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.299823 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc\") pod \"2a2ed237-e46a-4e93-8f92-780df31a2300\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") "
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.299881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z9d7\" (UniqueName: \"kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7\") pod \"2a2ed237-e46a-4e93-8f92-780df31a2300\" (UID: \"2a2ed237-e46a-4e93-8f92-780df31a2300\") "
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.300647 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.301126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config" (OuterVolumeSpecName: "config") pod "2a2ed237-e46a-4e93-8f92-780df31a2300" (UID: "2a2ed237-e46a-4e93-8f92-780df31a2300"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.301432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a2ed237-e46a-4e93-8f92-780df31a2300" (UID: "2a2ed237-e46a-4e93-8f92-780df31a2300"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.306785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7" (OuterVolumeSpecName: "kube-api-access-5z9d7") pod "2a2ed237-e46a-4e93-8f92-780df31a2300" (UID: "2a2ed237-e46a-4e93-8f92-780df31a2300"). InnerVolumeSpecName "kube-api-access-5z9d7". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.432945 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.432994 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z9d7\" (UniqueName: \"kubernetes.io/projected/2a2ed237-e46a-4e93-8f92-780df31a2300-kube-api-access-5z9d7\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.433010 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2ed237-e46a-4e93-8f92-780df31a2300-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:11 crc kubenswrapper[4749]: I1129 02:40:11.480319 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"] Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.151243 4749 generic.go:334] "Generic (PLEG): container finished" podID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerID="c303996bef9b52e916bc3ddc08bb69ff1013824ddd651abe0cea0b511b8436e9" exitCode=0 Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.151680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b59ccf65-jcttt" Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.151383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8557458f49-z25rs" event={"ID":"19e3fcc4-1774-442f-bbf0-4483c8c86eaa","Type":"ContainerDied","Data":"c303996bef9b52e916bc3ddc08bb69ff1013824ddd651abe0cea0b511b8436e9"} Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.151902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8557458f49-z25rs" event={"ID":"19e3fcc4-1774-442f-bbf0-4483c8c86eaa","Type":"ContainerStarted","Data":"35303ff790456d31f80531cfa2d798e30d26d67cc62ec0d1093265a76b2073a2"} Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.374238 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b59ccf65-jcttt"] Nov 29 02:40:12 crc kubenswrapper[4749]: I1129 02:40:12.381941 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b59ccf65-jcttt"] Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.095899 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a2ed237-e46a-4e93-8f92-780df31a2300" path="/var/lib/kubelet/pods/2a2ed237-e46a-4e93-8f92-780df31a2300/volumes" Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.172599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8557458f49-z25rs" event={"ID":"19e3fcc4-1774-442f-bbf0-4483c8c86eaa","Type":"ContainerStarted","Data":"a69b848dea3f1c1e36fffcf38e28d5fccd53eb9e8122ff35c02a5e70e967c431"} Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.172917 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.215519 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8557458f49-z25rs" podStartSLOduration=3.215493063 podStartE2EDuration="3.215493063s" podCreationTimestamp="2025-11-29 02:40:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-29 02:40:13.203696516 +0000 UTC m=+5356.375846413" watchObservedRunningTime="2025-11-29 02:40:13.215493063 +0000 UTC m=+5356.387642960" Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.851944 4749 scope.go:117] "RemoveContainer" containerID="9a9ae982b8e5d2f60686f788fc9a7a37ed26a51af3e168cc560fe5d02a830bbf" Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.880390 4749 scope.go:117] "RemoveContainer" containerID="009c5d59fe51ebe1a9aab29a36d065d2edb2cac00cac356c98ac15e1e518d118" Nov 29 02:40:13 crc kubenswrapper[4749]: I1129 02:40:13.954056 4749 scope.go:117] "RemoveContainer" containerID="e5b72ca19569b9e57e824122e8dafcd2a4bd05e1e4a554464ed33844c83d55ea" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.014294 4749 scope.go:117] "RemoveContainer" containerID="1f8aae235fd879794a524e3cc6c815a2e62a10bf87a95be4f90b947dbef4d72a" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.333109 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.334244 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.336980 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.346445 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.406788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.406968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfmbv\" (UniqueName: \"kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.407096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.508560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.508694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfmbv\" (UniqueName: \"kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.508825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.512938 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.512987 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b83e3595cf2095b088a3ce8c38ab82b410bce7c794d0d79d954391ac660b91a/globalmount\"" pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.516241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.533028 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfmbv\" (UniqueName: \"kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.564272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") pod \"ovn-copy-data\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " pod="openstack/ovn-copy-data" Nov 29 02:40:14 crc kubenswrapper[4749]: I1129 02:40:14.672043 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 29 02:40:15 crc kubenswrapper[4749]: I1129 02:40:15.027837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 29 02:40:15 crc kubenswrapper[4749]: W1129 02:40:15.035611 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e1acf21_8f1e_49f7_83fe_7a6fe053a823.slice/crio-28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f WatchSource:0}: Error finding container 28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f: Status 404 returned error can't find the container with id 28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f Nov 29 02:40:15 crc kubenswrapper[4749]: I1129 02:40:15.204620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5e1acf21-8f1e-49f7-83fe-7a6fe053a823","Type":"ContainerStarted","Data":"28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f"} Nov 29 02:40:16 crc kubenswrapper[4749]: I1129 02:40:16.218278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5e1acf21-8f1e-49f7-83fe-7a6fe053a823","Type":"ContainerStarted","Data":"50a096985df077833a10cea58e399e10a213947b479dd0a553b96283b1671ed8"} Nov 29 02:40:16 crc kubenswrapper[4749]: I1129 02:40:16.247028 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.247000244 podStartE2EDuration="3.247000244s" podCreationTimestamp="2025-11-29 02:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:16.237723369 +0000 UTC m=+5359.409873266" watchObservedRunningTime="2025-11-29 02:40:16.247000244 +0000 UTC m=+5359.419150141" Nov 29 02:40:18 crc kubenswrapper[4749]: E1129 02:40:18.885860 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:57922->38.102.83.30:35737: write tcp 38.102.83.30:57922->38.102.83.30:35737: write: connection reset by peer Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.186574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8557458f49-z25rs" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.273559 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"] Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.274277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="dnsmasq-dns" containerID="cri-o://e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3" gracePeriod=10 Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.715318 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.848548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc\") pod \"9994b917-838f-4a83-9ef5-1d79cda66294\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.848717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr48c\" (UniqueName: \"kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c\") pod \"9994b917-838f-4a83-9ef5-1d79cda66294\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.848768 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config\") pod \"9994b917-838f-4a83-9ef5-1d79cda66294\" (UID: \"9994b917-838f-4a83-9ef5-1d79cda66294\") " Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.860079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c" (OuterVolumeSpecName: "kube-api-access-qr48c") pod "9994b917-838f-4a83-9ef5-1d79cda66294" (UID: "9994b917-838f-4a83-9ef5-1d79cda66294"). InnerVolumeSpecName "kube-api-access-qr48c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.910070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9994b917-838f-4a83-9ef5-1d79cda66294" (UID: "9994b917-838f-4a83-9ef5-1d79cda66294"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.914872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config" (OuterVolumeSpecName: "config") pod "9994b917-838f-4a83-9ef5-1d79cda66294" (UID: "9994b917-838f-4a83-9ef5-1d79cda66294"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.950903 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr48c\" (UniqueName: \"kubernetes.io/projected/9994b917-838f-4a83-9ef5-1d79cda66294-kube-api-access-qr48c\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.950937 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:21 crc kubenswrapper[4749]: I1129 02:40:21.950950 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9994b917-838f-4a83-9ef5-1d79cda66294-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.290746 4749 generic.go:334] "Generic (PLEG): container finished" podID="9994b917-838f-4a83-9ef5-1d79cda66294" containerID="e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3" exitCode=0 Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.290821 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.290876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerDied","Data":"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3"} Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.291980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-88dhl" event={"ID":"9994b917-838f-4a83-9ef5-1d79cda66294","Type":"ContainerDied","Data":"0ee0d67d7299e7fbfe9f1134a148120b9cb819c3b0e28602fac8f1c4365bdb02"} Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.292027 4749 scope.go:117] "RemoveContainer" containerID="e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.330080 4749 scope.go:117] "RemoveContainer" containerID="9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.355409 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"] Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.371784 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-88dhl"] Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.392617 4749 scope.go:117] "RemoveContainer" containerID="e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3" Nov 29 02:40:22 crc kubenswrapper[4749]: E1129 02:40:22.393688 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3\": container with ID starting with e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3 not found: ID does not exist" containerID="e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.393777 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3"} err="failed to get container status 
\"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3\": rpc error: code = NotFound desc = could not find container \"e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3\": container with ID starting with e282332ff44c24e728a50f144b9e64d969e053ca348dcee3802668b65a20dec3 not found: ID does not exist" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.393824 4749 scope.go:117] "RemoveContainer" containerID="9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730" Nov 29 02:40:22 crc kubenswrapper[4749]: E1129 02:40:22.394618 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730\": container with ID starting with 9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730 not found: ID does not exist" containerID="9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.394864 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730"} err="failed to get container status \"9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730\": rpc error: code = NotFound desc = could not find container \"9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730\": container with ID starting with 9f33d63f077dd04f9a9930290170e4a7c670e91f1799c9a56ec9dcc1a8bf7730 not found: ID does not exist" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.533289 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 29 02:40:22 crc kubenswrapper[4749]: E1129 02:40:22.533821 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="init" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.533843 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="init" Nov 29 02:40:22 crc kubenswrapper[4749]: E1129 02:40:22.533866 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="dnsmasq-dns" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.533880 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="dnsmasq-dns" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.534159 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" containerName="dnsmasq-dns" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.535687 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.539838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-n6sw4" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.540009 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.541808 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.607632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.663308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-config\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.663398 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-scripts\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.663422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adc9ee4b-6d85-4100-8d10-64163bf250c0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.663443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc9ee4b-6d85-4100-8d10-64163bf250c0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.663479 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wwq\" (UniqueName: \"kubernetes.io/projected/adc9ee4b-6d85-4100-8d10-64163bf250c0-kube-api-access-z7wwq\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.764732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-config\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.764846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-scripts\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.764879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adc9ee4b-6d85-4100-8d10-64163bf250c0-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.764920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc9ee4b-6d85-4100-8d10-64163bf250c0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.764968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wwq\" (UniqueName: \"kubernetes.io/projected/adc9ee4b-6d85-4100-8d10-64163bf250c0-kube-api-access-z7wwq\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.765908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/adc9ee4b-6d85-4100-8d10-64163bf250c0-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.765915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-scripts\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.766140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc9ee4b-6d85-4100-8d10-64163bf250c0-config\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.773219 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adc9ee4b-6d85-4100-8d10-64163bf250c0-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.785349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wwq\" (UniqueName: \"kubernetes.io/projected/adc9ee4b-6d85-4100-8d10-64163bf250c0-kube-api-access-z7wwq\") pod \"ovn-northd-0\" (UID: \"adc9ee4b-6d85-4100-8d10-64163bf250c0\") " pod="openstack/ovn-northd-0" Nov 29 02:40:22 crc kubenswrapper[4749]: I1129 02:40:22.866229 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 29 02:40:23 crc kubenswrapper[4749]: I1129 02:40:23.100900 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9994b917-838f-4a83-9ef5-1d79cda66294" path="/var/lib/kubelet/pods/9994b917-838f-4a83-9ef5-1d79cda66294/volumes" Nov 29 02:40:23 crc kubenswrapper[4749]: W1129 02:40:23.381365 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadc9ee4b_6d85_4100_8d10_64163bf250c0.slice/crio-6d522dc18f6a6730d6f4d7c8cb737d879cd4734fd88f1415fd1092e3c97264d3 WatchSource:0}: Error finding container 6d522dc18f6a6730d6f4d7c8cb737d879cd4734fd88f1415fd1092e3c97264d3: Status 404 returned error can't find the container with id 6d522dc18f6a6730d6f4d7c8cb737d879cd4734fd88f1415fd1092e3c97264d3 Nov 29 02:40:23 crc kubenswrapper[4749]: I1129 02:40:23.381600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 29 02:40:24 crc kubenswrapper[4749]: I1129 02:40:24.328240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adc9ee4b-6d85-4100-8d10-64163bf250c0","Type":"ContainerStarted","Data":"aefc7131f42556025a586aa25b1190f273b0c0ca236702ae26e1b2241bbd1d67"} Nov 29 02:40:24 crc kubenswrapper[4749]: I1129 02:40:24.328704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adc9ee4b-6d85-4100-8d10-64163bf250c0","Type":"ContainerStarted","Data":"c47c3edc1d643a07132ae58610cd7b75eab9faca2b9d53423d510f83204c93f6"} Nov 29 02:40:24 crc kubenswrapper[4749]: I1129 02:40:24.328735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 29 02:40:24 crc kubenswrapper[4749]: I1129 02:40:24.328754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"adc9ee4b-6d85-4100-8d10-64163bf250c0","Type":"ContainerStarted","Data":"6d522dc18f6a6730d6f4d7c8cb737d879cd4734fd88f1415fd1092e3c97264d3"} Nov 29 02:40:24 crc kubenswrapper[4749]: I1129 02:40:24.358448 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.358418544 podStartE2EDuration="2.358418544s" podCreationTimestamp="2025-11-29 02:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:24.355044782 +0000 UTC m=+5367.527194729" watchObservedRunningTime="2025-11-29 02:40:24.358418544 +0000 UTC m=+5367.530568441" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.449280 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7lw9x"] Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.453014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.460131 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7lw9x"] Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.555618 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8dd6-account-create-update-xf568"] Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.557271 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.559825 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.575697 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts\") pod \"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.575900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bx9\" (UniqueName: \"kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9\") pod \"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.578052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8dd6-account-create-update-xf568"] Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.677298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.677357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bx9\" (UniqueName: \"kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9\") pod \"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.677388 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts\") pod \"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.677450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4fr5\" (UniqueName: \"kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.678375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts\") pod \"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.698584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bx9\" (UniqueName: \"kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9\") pod 
\"keystone-db-create-7lw9x\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.775008 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.779443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4fr5\" (UniqueName: \"kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.779596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.780826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.817135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4fr5\" (UniqueName: \"kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5\") pod \"keystone-8dd6-account-create-update-xf568\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:28 crc kubenswrapper[4749]: I1129 02:40:28.877362 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:29 crc kubenswrapper[4749]: W1129 02:40:29.137699 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7adc6b_04e8_47a2_a570_c8e37a608860.slice/crio-f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04 WatchSource:0}: Error finding container f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04: Status 404 returned error can't find the container with id f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04 Nov 29 02:40:29 crc kubenswrapper[4749]: I1129 02:40:29.140073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7lw9x"] Nov 29 02:40:29 crc kubenswrapper[4749]: I1129 02:40:29.376077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7lw9x" event={"ID":"1e7adc6b-04e8-47a2-a570-c8e37a608860","Type":"ContainerStarted","Data":"65f1a59e3ba03494e4d214bee39f739421180a69941f6d32de304948ea5cef79"} Nov 29 02:40:29 crc kubenswrapper[4749]: I1129 02:40:29.376384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7lw9x" event={"ID":"1e7adc6b-04e8-47a2-a570-c8e37a608860","Type":"ContainerStarted","Data":"f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04"} Nov 29 02:40:29 crc kubenswrapper[4749]: I1129 02:40:29.393809 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7lw9x" podStartSLOduration=1.393788879 podStartE2EDuration="1.393788879s" podCreationTimestamp="2025-11-29 02:40:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:29.391600276 +0000 UTC m=+5372.563750143" watchObservedRunningTime="2025-11-29 02:40:29.393788879 +0000 UTC m=+5372.565938746" Nov 29 02:40:29 crc kubenswrapper[4749]: W1129 02:40:29.420345 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8db62156_8d63_4864_a706_67ebdab6170c.slice/crio-91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa WatchSource:0}: Error finding container 91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa: Status 404 returned error can't find the container with id 91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa Nov 29 02:40:29 crc kubenswrapper[4749]: I1129 02:40:29.420398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8dd6-account-create-update-xf568"] Nov 29 02:40:30 crc kubenswrapper[4749]: I1129 02:40:30.437814 4749 generic.go:334] "Generic (PLEG): container finished" podID="8db62156-8d63-4864-a706-67ebdab6170c" containerID="f82e00b9bde96e9dff121bf21813ccceda8ddd947f5184c7e247a3e5dddc7c1e" exitCode=0 Nov 29 02:40:30 crc kubenswrapper[4749]: I1129 02:40:30.437921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8dd6-account-create-update-xf568" event={"ID":"8db62156-8d63-4864-a706-67ebdab6170c","Type":"ContainerDied","Data":"f82e00b9bde96e9dff121bf21813ccceda8ddd947f5184c7e247a3e5dddc7c1e"} Nov 29 02:40:30 crc kubenswrapper[4749]: I1129 02:40:30.438173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8dd6-account-create-update-xf568" 
event={"ID":"8db62156-8d63-4864-a706-67ebdab6170c","Type":"ContainerStarted","Data":"91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa"} Nov 29 02:40:30 crc kubenswrapper[4749]: I1129 02:40:30.440889 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e7adc6b-04e8-47a2-a570-c8e37a608860" containerID="65f1a59e3ba03494e4d214bee39f739421180a69941f6d32de304948ea5cef79" exitCode=0 Nov 29 02:40:30 crc kubenswrapper[4749]: I1129 02:40:30.440970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7lw9x" event={"ID":"1e7adc6b-04e8-47a2-a570-c8e37a608860","Type":"ContainerDied","Data":"65f1a59e3ba03494e4d214bee39f739421180a69941f6d32de304948ea5cef79"} Nov 29 02:40:31 crc kubenswrapper[4749]: I1129 02:40:31.926479 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:31 crc kubenswrapper[4749]: I1129 02:40:31.934157 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.060264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts\") pod \"1e7adc6b-04e8-47a2-a570-c8e37a608860\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.060467 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4fr5\" (UniqueName: \"kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5\") pod \"8db62156-8d63-4864-a706-67ebdab6170c\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.060570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bx9\" (UniqueName: \"kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9\") pod \"1e7adc6b-04e8-47a2-a570-c8e37a608860\" (UID: \"1e7adc6b-04e8-47a2-a570-c8e37a608860\") " Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.060647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts\") pod \"8db62156-8d63-4864-a706-67ebdab6170c\" (UID: \"8db62156-8d63-4864-a706-67ebdab6170c\") " Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.061032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e7adc6b-04e8-47a2-a570-c8e37a608860" (UID: "1e7adc6b-04e8-47a2-a570-c8e37a608860"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.061306 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7adc6b-04e8-47a2-a570-c8e37a608860-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.061658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8db62156-8d63-4864-a706-67ebdab6170c" (UID: "8db62156-8d63-4864-a706-67ebdab6170c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.065723 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5" (OuterVolumeSpecName: "kube-api-access-f4fr5") pod "8db62156-8d63-4864-a706-67ebdab6170c" (UID: "8db62156-8d63-4864-a706-67ebdab6170c"). InnerVolumeSpecName "kube-api-access-f4fr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.067698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9" (OuterVolumeSpecName: "kube-api-access-g5bx9") pod "1e7adc6b-04e8-47a2-a570-c8e37a608860" (UID: "1e7adc6b-04e8-47a2-a570-c8e37a608860"). InnerVolumeSpecName "kube-api-access-g5bx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.162980 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4fr5\" (UniqueName: \"kubernetes.io/projected/8db62156-8d63-4864-a706-67ebdab6170c-kube-api-access-f4fr5\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.163027 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bx9\" (UniqueName: \"kubernetes.io/projected/1e7adc6b-04e8-47a2-a570-c8e37a608860-kube-api-access-g5bx9\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.163041 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8db62156-8d63-4864-a706-67ebdab6170c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.468422 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7lw9x" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.468426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7lw9x" event={"ID":"1e7adc6b-04e8-47a2-a570-c8e37a608860","Type":"ContainerDied","Data":"f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04"} Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.468631 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b73156184539fe73c058694d50d88101532c61dcf635a52db3428b06c25c04" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.471043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8dd6-account-create-update-xf568" event={"ID":"8db62156-8d63-4864-a706-67ebdab6170c","Type":"ContainerDied","Data":"91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa"} Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.471287 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a7c1d8746c43de804597fae564a00849e3a84f1c67e81f8d3c4cc93751aeaa" Nov 29 02:40:32 crc kubenswrapper[4749]: I1129 02:40:32.471097 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8dd6-account-create-update-xf568" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.165382 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c88sj"] Nov 29 02:40:34 crc kubenswrapper[4749]: E1129 02:40:34.166229 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7adc6b-04e8-47a2-a570-c8e37a608860" containerName="mariadb-database-create" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.166250 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7adc6b-04e8-47a2-a570-c8e37a608860" containerName="mariadb-database-create" Nov 29 02:40:34 crc kubenswrapper[4749]: E1129 02:40:34.166274 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db62156-8d63-4864-a706-67ebdab6170c" containerName="mariadb-account-create-update" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.166287 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db62156-8d63-4864-a706-67ebdab6170c" containerName="mariadb-account-create-update" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.166539 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7adc6b-04e8-47a2-a570-c8e37a608860" containerName="mariadb-database-create" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.166576 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db62156-8d63-4864-a706-67ebdab6170c" containerName="mariadb-account-create-update" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.167440 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.171524 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzt9m" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.172820 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.173336 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.173735 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c88sj"] Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.173761 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.301716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.301798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlphz\" (UniqueName: \"kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.301850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.403517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.403560 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlphz\" (UniqueName: \"kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.403585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.411674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " 
pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.420000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.425243 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlphz\" (UniqueName: \"kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz\") pod \"keystone-db-sync-c88sj\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.497303 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:34 crc kubenswrapper[4749]: I1129 02:40:34.993753 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c88sj"] Nov 29 02:40:35 crc kubenswrapper[4749]: I1129 02:40:35.507903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c88sj" event={"ID":"31a77337-373b-4b13-93d4-8f338b35f511","Type":"ContainerStarted","Data":"7f54f9502806954614081ae49fbf1cc5873f5880151030cef00f6f3b5559f2fd"} Nov 29 02:40:35 crc kubenswrapper[4749]: I1129 02:40:35.508372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c88sj" event={"ID":"31a77337-373b-4b13-93d4-8f338b35f511","Type":"ContainerStarted","Data":"d9ea414c16a5958496525e5ba50b372b23a5556eb83605af8e6dbb4328a73ada"} Nov 29 02:40:35 crc kubenswrapper[4749]: I1129 02:40:35.526147 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c88sj" podStartSLOduration=1.526125457 podStartE2EDuration="1.526125457s" podCreationTimestamp="2025-11-29 02:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:35.522801096 +0000 UTC m=+5378.694951003" watchObservedRunningTime="2025-11-29 02:40:35.526125457 +0000 UTC m=+5378.698275324" Nov 29 02:40:37 crc kubenswrapper[4749]: I1129 02:40:37.530575 4749 generic.go:334] "Generic (PLEG): container finished" podID="31a77337-373b-4b13-93d4-8f338b35f511" containerID="7f54f9502806954614081ae49fbf1cc5873f5880151030cef00f6f3b5559f2fd" exitCode=0 Nov 29 02:40:37 crc kubenswrapper[4749]: I1129 02:40:37.530690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c88sj" event={"ID":"31a77337-373b-4b13-93d4-8f338b35f511","Type":"ContainerDied","Data":"7f54f9502806954614081ae49fbf1cc5873f5880151030cef00f6f3b5559f2fd"} Nov 29 02:40:37 crc kubenswrapper[4749]: I1129 02:40:37.967569 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 29 02:40:38 crc kubenswrapper[4749]: I1129 02:40:38.943980 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.104139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle\") pod \"31a77337-373b-4b13-93d4-8f338b35f511\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.104755 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlphz\" (UniqueName: \"kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz\") pod \"31a77337-373b-4b13-93d4-8f338b35f511\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.104851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data\") pod \"31a77337-373b-4b13-93d4-8f338b35f511\" (UID: \"31a77337-373b-4b13-93d4-8f338b35f511\") " Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.130859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz" (OuterVolumeSpecName: "kube-api-access-dlphz") pod "31a77337-373b-4b13-93d4-8f338b35f511" (UID: "31a77337-373b-4b13-93d4-8f338b35f511"). InnerVolumeSpecName "kube-api-access-dlphz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.161988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31a77337-373b-4b13-93d4-8f338b35f511" (UID: "31a77337-373b-4b13-93d4-8f338b35f511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.167853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data" (OuterVolumeSpecName: "config-data") pod "31a77337-373b-4b13-93d4-8f338b35f511" (UID: "31a77337-373b-4b13-93d4-8f338b35f511"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.224008 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.224052 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlphz\" (UniqueName: \"kubernetes.io/projected/31a77337-373b-4b13-93d4-8f338b35f511-kube-api-access-dlphz\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.224072 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a77337-373b-4b13-93d4-8f338b35f511-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.548947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c88sj" event={"ID":"31a77337-373b-4b13-93d4-8f338b35f511","Type":"ContainerDied","Data":"d9ea414c16a5958496525e5ba50b372b23a5556eb83605af8e6dbb4328a73ada"} Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.548985 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9ea414c16a5958496525e5ba50b372b23a5556eb83605af8e6dbb4328a73ada" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.549082 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c88sj" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.832381 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:40:39 crc kubenswrapper[4749]: E1129 02:40:39.832981 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a77337-373b-4b13-93d4-8f338b35f511" containerName="keystone-db-sync" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.833000 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a77337-373b-4b13-93d4-8f338b35f511" containerName="keystone-db-sync" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.833562 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a77337-373b-4b13-93d4-8f338b35f511" containerName="keystone-db-sync" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.834619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.857292 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jtvtd"] Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.859003 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.863719 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.863772 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzt9m" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.863920 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.863718 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.864162 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.870015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.880055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jtvtd"] Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.936889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.936950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rw2d\" (UniqueName: \"kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.937134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.937242 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:39 crc kubenswrapper[4749]: I1129 02:40:39.937387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: 
\"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038597 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xczv\" (UniqueName: \"kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038636 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rw2d\" (UniqueName: \"kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.038920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: 
\"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.039734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.040015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.060140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rw2d\" (UniqueName: \"kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d\") pod \"dnsmasq-dns-869b8c7567-wnls6\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.140538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.140622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.140641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xczv\" (UniqueName: \"kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc 
kubenswrapper[4749]: I1129 02:40:40.140689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.140708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.140737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.143894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.144278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.144476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.144600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.145896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.161708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xczv\" (UniqueName: \"kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv\") pod \"keystone-bootstrap-jtvtd\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.165788 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.183968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.652306 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jtvtd"] Nov 29 02:40:40 crc kubenswrapper[4749]: W1129 02:40:40.744035 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e61d941_9458_4dd0_9468_f977b67c922c.slice/crio-2dd70d5a3250979e0a56098bd239e2ac955b5221139da16cb4afa3256fe96e32 WatchSource:0}: Error finding container 2dd70d5a3250979e0a56098bd239e2ac955b5221139da16cb4afa3256fe96e32: Status 404 returned error can't find the container with id 2dd70d5a3250979e0a56098bd239e2ac955b5221139da16cb4afa3256fe96e32 Nov 29 02:40:40 crc kubenswrapper[4749]: I1129 02:40:40.746051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.570807 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e61d941-9458-4dd0-9468-f977b67c922c" containerID="ff9b182d95f581343ae1b70666dca164862b676c7735185c82be771074d2c164" exitCode=0 Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.570993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" event={"ID":"5e61d941-9458-4dd0-9468-f977b67c922c","Type":"ContainerDied","Data":"ff9b182d95f581343ae1b70666dca164862b676c7735185c82be771074d2c164"} Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.571069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" event={"ID":"5e61d941-9458-4dd0-9468-f977b67c922c","Type":"ContainerStarted","Data":"2dd70d5a3250979e0a56098bd239e2ac955b5221139da16cb4afa3256fe96e32"} Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.573394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jtvtd" event={"ID":"8770b377-e276-40d6-a325-9fb7b96e94cc","Type":"ContainerStarted","Data":"2553c087b805afe93c9582071d4686e8977af0232ee379015f8df1a9a6ebe9d6"} Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.573441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jtvtd" event={"ID":"8770b377-e276-40d6-a325-9fb7b96e94cc","Type":"ContainerStarted","Data":"55855ca97b136b5bfb0fe75768f50a968260965935ae67074f9486569e6de8c5"} Nov 29 02:40:41 crc kubenswrapper[4749]: I1129 02:40:41.633855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jtvtd" podStartSLOduration=2.633826505 podStartE2EDuration="2.633826505s" podCreationTimestamp="2025-11-29 02:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:41.631387366 +0000 UTC m=+5384.803537233" watchObservedRunningTime="2025-11-29 02:40:41.633826505 +0000 UTC m=+5384.805976392" Nov 29 02:40:42 crc kubenswrapper[4749]: I1129 02:40:42.590489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" event={"ID":"5e61d941-9458-4dd0-9468-f977b67c922c","Type":"ContainerStarted","Data":"bad4e54e311d51a434c7888485ddf4daa06a067b8c79f44fcaaf65859bc7c861"} Nov 29 02:40:42 crc kubenswrapper[4749]: I1129 
02:40:42.590614 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:40:42 crc kubenswrapper[4749]: I1129 02:40:42.625514 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" podStartSLOduration=3.625480628 podStartE2EDuration="3.625480628s" podCreationTimestamp="2025-11-29 02:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:42.623386777 +0000 UTC m=+5385.795536704" watchObservedRunningTime="2025-11-29 02:40:42.625480628 +0000 UTC m=+5385.797630555" Nov 29 02:40:44 crc kubenswrapper[4749]: I1129 02:40:44.610930 4749 generic.go:334] "Generic (PLEG): container finished" podID="8770b377-e276-40d6-a325-9fb7b96e94cc" containerID="2553c087b805afe93c9582071d4686e8977af0232ee379015f8df1a9a6ebe9d6" exitCode=0 Nov 29 02:40:44 crc kubenswrapper[4749]: I1129 02:40:44.611002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jtvtd" event={"ID":"8770b377-e276-40d6-a325-9fb7b96e94cc","Type":"ContainerDied","Data":"2553c087b805afe93c9582071d4686e8977af0232ee379015f8df1a9a6ebe9d6"} Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.025829 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jtvtd" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xczv\" (UniqueName: \"kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171352 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") " Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.171436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data\") pod \"8770b377-e276-40d6-a325-9fb7b96e94cc\" (UID: \"8770b377-e276-40d6-a325-9fb7b96e94cc\") 
" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.179712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.179794 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts" (OuterVolumeSpecName: "scripts") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.181328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv" (OuterVolumeSpecName: "kube-api-access-2xczv") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "kube-api-access-2xczv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.181430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.211951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.215945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data" (OuterVolumeSpecName: "config-data") pod "8770b377-e276-40d6-a325-9fb7b96e94cc" (UID: "8770b377-e276-40d6-a325-9fb7b96e94cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274845 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274874 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274894 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274936 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274947 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8770b377-e276-40d6-a325-9fb7b96e94cc-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.274957 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xczv\" (UniqueName: \"kubernetes.io/projected/8770b377-e276-40d6-a325-9fb7b96e94cc-kube-api-access-2xczv\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.630082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jtvtd" event={"ID":"8770b377-e276-40d6-a325-9fb7b96e94cc","Type":"ContainerDied","Data":"55855ca97b136b5bfb0fe75768f50a968260965935ae67074f9486569e6de8c5"}
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.630119 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55855ca97b136b5bfb0fe75768f50a968260965935ae67074f9486569e6de8c5"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.630468 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jtvtd"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.705722 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jtvtd"]
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.713119 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jtvtd"]
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.811352 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-27gd8"]
Nov 29 02:40:46 crc kubenswrapper[4749]: E1129 02:40:46.811750 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8770b377-e276-40d6-a325-9fb7b96e94cc" containerName="keystone-bootstrap"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.811770 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8770b377-e276-40d6-a325-9fb7b96e94cc" containerName="keystone-bootstrap"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.811965 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8770b377-e276-40d6-a325-9fb7b96e94cc" containerName="keystone-bootstrap"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.812616 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.815429 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.815650 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.817466 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.818029 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.818269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzt9m"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.829518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-27gd8"]
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.985978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.986023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.986050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.986079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xtl\" (UniqueName: \"kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.986288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:46 crc kubenswrapper[4749]: I1129 02:40:46.986407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089107 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089474 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089546 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.089677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xtl\" (UniqueName: \"kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.091082 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8770b377-e276-40d6-a325-9fb7b96e94cc" path="/var/lib/kubelet/pods/8770b377-e276-40d6-a325-9fb7b96e94cc/volumes"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.093134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.093463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.093678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.095877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.100717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.123100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xtl\" (UniqueName: \"kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl\") pod \"keystone-bootstrap-27gd8\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") " pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.127913 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:47 crc kubenswrapper[4749]: I1129 02:40:47.635656 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-27gd8"]
Nov 29 02:40:48 crc kubenswrapper[4749]: I1129 02:40:48.654357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-27gd8" event={"ID":"5c4a187b-e26a-45b8-83d7-a9507f71ac24","Type":"ContainerStarted","Data":"f7c0f70b0063879c18dfaa90a5eabb8b3948594c314efa818996c2b7721c34bb"}
Nov 29 02:40:48 crc kubenswrapper[4749]: I1129 02:40:48.654732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-27gd8" event={"ID":"5c4a187b-e26a-45b8-83d7-a9507f71ac24","Type":"ContainerStarted","Data":"d8b11c5cbac2c04dce6a891bda686636d937b33fadbd2d2f28018fb85fcee3fc"}
Nov 29 02:40:48 crc kubenswrapper[4749]: I1129 02:40:48.689508 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-27gd8" podStartSLOduration=2.689481314 podStartE2EDuration="2.689481314s" podCreationTimestamp="2025-11-29 02:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:48.67982924 +0000 UTC m=+5391.851979097" watchObservedRunningTime="2025-11-29 02:40:48.689481314 +0000 UTC m=+5391.861631211"
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.168028 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-869b8c7567-wnls6"
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.242378 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"]
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.242649 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8557458f49-z25rs" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="dnsmasq-dns" containerID="cri-o://a69b848dea3f1c1e36fffcf38e28d5fccd53eb9e8122ff35c02a5e70e967c431" gracePeriod=10
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.673887 4749 generic.go:334] "Generic (PLEG): container finished" podID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerID="a69b848dea3f1c1e36fffcf38e28d5fccd53eb9e8122ff35c02a5e70e967c431" exitCode=0
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.673982 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8557458f49-z25rs" event={"ID":"19e3fcc4-1774-442f-bbf0-4483c8c86eaa","Type":"ContainerDied","Data":"a69b848dea3f1c1e36fffcf38e28d5fccd53eb9e8122ff35c02a5e70e967c431"}
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.674017 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8557458f49-z25rs" event={"ID":"19e3fcc4-1774-442f-bbf0-4483c8c86eaa","Type":"ContainerDied","Data":"35303ff790456d31f80531cfa2d798e30d26d67cc62ec0d1093265a76b2073a2"}
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.674058 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35303ff790456d31f80531cfa2d798e30d26d67cc62ec0d1093265a76b2073a2"
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.675612 4749 generic.go:334] "Generic (PLEG): container finished" podID="5c4a187b-e26a-45b8-83d7-a9507f71ac24" containerID="f7c0f70b0063879c18dfaa90a5eabb8b3948594c314efa818996c2b7721c34bb" exitCode=0
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.675663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-27gd8" event={"ID":"5c4a187b-e26a-45b8-83d7-a9507f71ac24","Type":"ContainerDied","Data":"f7c0f70b0063879c18dfaa90a5eabb8b3948594c314efa818996c2b7721c34bb"}
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.715483 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.779000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc\") pod \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") "
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.779171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config\") pod \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") "
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.779222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8kpw\" (UniqueName: \"kubernetes.io/projected/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-kube-api-access-r8kpw\") pod \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") "
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.779265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb\") pod \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") "
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.779289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb\") pod \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\" (UID: \"19e3fcc4-1774-442f-bbf0-4483c8c86eaa\") "
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.784642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-kube-api-access-r8kpw" (OuterVolumeSpecName: "kube-api-access-r8kpw") pod "19e3fcc4-1774-442f-bbf0-4483c8c86eaa" (UID: "19e3fcc4-1774-442f-bbf0-4483c8c86eaa"). InnerVolumeSpecName "kube-api-access-r8kpw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.816130 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config" (OuterVolumeSpecName: "config") pod "19e3fcc4-1774-442f-bbf0-4483c8c86eaa" (UID: "19e3fcc4-1774-442f-bbf0-4483c8c86eaa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.833898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19e3fcc4-1774-442f-bbf0-4483c8c86eaa" (UID: "19e3fcc4-1774-442f-bbf0-4483c8c86eaa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.835181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19e3fcc4-1774-442f-bbf0-4483c8c86eaa" (UID: "19e3fcc4-1774-442f-bbf0-4483c8c86eaa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.845846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19e3fcc4-1774-442f-bbf0-4483c8c86eaa" (UID: "19e3fcc4-1774-442f-bbf0-4483c8c86eaa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.880552 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.880583 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-config\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.880593 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8kpw\" (UniqueName: \"kubernetes.io/projected/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-kube-api-access-r8kpw\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.880603 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:50 crc kubenswrapper[4749]: I1129 02:40:50.880613 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19e3fcc4-1774-442f-bbf0-4483c8c86eaa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.258634 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:40:51 crc kubenswrapper[4749]: E1129 02:40:51.259297 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="init"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.259325 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="init"
Nov 29 02:40:51 crc kubenswrapper[4749]: E1129 02:40:51.259381 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="dnsmasq-dns"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.259399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="dnsmasq-dns"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.259707 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" containerName="dnsmasq-dns"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.261971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.282066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.288971 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4q7\" (UniqueName: \"kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.289133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.289285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.390817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.390929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4q7\" (UniqueName: \"kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.391018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.391778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.391795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.423522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4q7\" (UniqueName: \"kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7\") pod \"community-operators-nnpvr\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") " pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.594104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.684145 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8557458f49-z25rs"
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.709133 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"]
Nov 29 02:40:51 crc kubenswrapper[4749]: I1129 02:40:51.720759 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8557458f49-z25rs"]
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.135960 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203106 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xtl\" (UniqueName: \"kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203365 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.203441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys\") pod \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\" (UID: \"5c4a187b-e26a-45b8-83d7-a9507f71ac24\") "
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.209174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.209299 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.209377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts" (OuterVolumeSpecName: "scripts") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.219975 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl" (OuterVolumeSpecName: "kube-api-access-d6xtl") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "kube-api-access-d6xtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.221578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:40:52 crc kubenswrapper[4749]: W1129 02:40:52.231411 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02ebaec4_8e37_483f_8b86_1a793a44572b.slice/crio-e8d9c375fd0129c14475495b07a2883651763c84304276fbbc8cd9253a0e1823 WatchSource:0}: Error finding container e8d9c375fd0129c14475495b07a2883651763c84304276fbbc8cd9253a0e1823: Status 404 returned error can't find the container with id e8d9c375fd0129c14475495b07a2883651763c84304276fbbc8cd9253a0e1823
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.240585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data" (OuterVolumeSpecName: "config-data") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.251095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c4a187b-e26a-45b8-83d7-a9507f71ac24" (UID: "5c4a187b-e26a-45b8-83d7-a9507f71ac24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305415 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305455 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305471 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305484 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305497 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c4a187b-e26a-45b8-83d7-a9507f71ac24-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.305509 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xtl\" (UniqueName: \"kubernetes.io/projected/5c4a187b-e26a-45b8-83d7-a9507f71ac24-kube-api-access-d6xtl\") on node \"crc\" DevicePath \"\""
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.700077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-27gd8" event={"ID":"5c4a187b-e26a-45b8-83d7-a9507f71ac24","Type":"ContainerDied","Data":"d8b11c5cbac2c04dce6a891bda686636d937b33fadbd2d2f28018fb85fcee3fc"}
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.700155 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b11c5cbac2c04dce6a891bda686636d937b33fadbd2d2f28018fb85fcee3fc"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.700106 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-27gd8"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.703307 4749 generic.go:334] "Generic (PLEG): container finished" podID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerID="1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e" exitCode=0
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.703344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerDied","Data":"1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e"}
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.703402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerStarted","Data":"e8d9c375fd0129c14475495b07a2883651763c84304276fbbc8cd9253a0e1823"}
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.806810 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-595f8844b9-gpbrs"]
Nov 29 02:40:52 crc kubenswrapper[4749]: E1129 02:40:52.807423 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4a187b-e26a-45b8-83d7-a9507f71ac24" containerName="keystone-bootstrap"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.807457 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4a187b-e26a-45b8-83d7-a9507f71ac24" containerName="keystone-bootstrap"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.807781 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4a187b-e26a-45b8-83d7-a9507f71ac24" containerName="keystone-bootstrap"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.808743 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.811136 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.811866 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.812299 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.815021 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wzt9m"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.817503 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595f8844b9-gpbrs"]
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.914695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-fernet-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.914749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-combined-ca-bundle\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.914777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj556\" (UniqueName: \"kubernetes.io/projected/80d0cbf8-416f-4012-9123-58c4921deb36-kube-api-access-cj556\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.914801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-credential-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.915019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-scripts\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:52 crc kubenswrapper[4749]: I1129 02:40:52.915176 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-config-data\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.016821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-fernet-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.017145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-combined-ca-bundle\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.017188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj556\" (UniqueName: \"kubernetes.io/projected/80d0cbf8-416f-4012-9123-58c4921deb36-kube-api-access-cj556\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.017293 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-credential-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.017337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-scripts\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.017412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-config-data\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.023043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-combined-ca-bundle\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.023409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-config-data\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.024639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-credential-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.025090 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-fernet-keys\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.027521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80d0cbf8-416f-4012-9123-58c4921deb36-scripts\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.050428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj556\" (UniqueName: \"kubernetes.io/projected/80d0cbf8-416f-4012-9123-58c4921deb36-kube-api-access-cj556\") pod \"keystone-595f8844b9-gpbrs\" (UID: \"80d0cbf8-416f-4012-9123-58c4921deb36\") " pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.092078 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19e3fcc4-1774-442f-bbf0-4483c8c86eaa" path="/var/lib/kubelet/pods/19e3fcc4-1774-442f-bbf0-4483c8c86eaa/volumes"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.153919 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:53 crc kubenswrapper[4749]: I1129 02:40:53.691335 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-595f8844b9-gpbrs"]
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.732322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-595f8844b9-gpbrs" event={"ID":"80d0cbf8-416f-4012-9123-58c4921deb36","Type":"ContainerStarted","Data":"9942b594615c301aa24b81c57c1e10b8805d81f69e5aeb075c7314c60dc92c38"}
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.732782 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.732848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-595f8844b9-gpbrs" event={"ID":"80d0cbf8-416f-4012-9123-58c4921deb36","Type":"ContainerStarted","Data":"eebc302dd85c3013a3f8c1805ef59a096d768c0dc277e5bf2f46a842b6eb291d"}
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.736657 4749 generic.go:334] "Generic (PLEG): container finished" podID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerID="034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d" exitCode=0
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.736734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerDied","Data":"034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d"}
Nov 29 02:40:54 crc kubenswrapper[4749]: I1129 02:40:54.763650 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-595f8844b9-gpbrs" podStartSLOduration=2.763606447 podStartE2EDuration="2.763606447s" podCreationTimestamp="2025-11-29 02:40:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:40:54.761358042 +0000 UTC m=+5397.933507969" watchObservedRunningTime="2025-11-29 02:40:54.763606447 +0000 UTC m=+5397.935756314"
Nov 29 02:40:55 crc kubenswrapper[4749]: I1129 02:40:55.748720 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerStarted","Data":"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"}
Nov 29 02:40:55 crc kubenswrapper[4749]: I1129 02:40:55.786719 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnpvr" podStartSLOduration=2.303305215 podStartE2EDuration="4.786695524s" podCreationTimestamp="2025-11-29 02:40:51 +0000 UTC" firstStartedPulling="2025-11-29 02:40:52.705477924 +0000 UTC m=+5395.877627811" lastFinishedPulling="2025-11-29 02:40:55.188868263 +0000 UTC m=+5398.361018120" observedRunningTime="2025-11-29 02:40:55.781224901 +0000 UTC m=+5398.953374788" watchObservedRunningTime="2025-11-29 02:40:55.786695524 +0000 UTC m=+5398.958845391"
Nov 29 02:41:01 crc kubenswrapper[4749]: I1129 02:41:01.594160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:01 crc kubenswrapper[4749]: I1129 02:41:01.595986 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:01 crc kubenswrapper[4749]: I1129 02:41:01.647488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:01 crc kubenswrapper[4749]: I1129 02:41:01.879608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:01 crc kubenswrapper[4749]: I1129 02:41:01.945035 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:41:03 crc kubenswrapper[4749]: I1129 02:41:03.851464 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnpvr" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="registry-server" containerID="cri-o://d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f" gracePeriod=2
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.406606 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.463430 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content\") pod \"02ebaec4-8e37-483f-8b86-1a793a44572b\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") "
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.463541 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc4q7\" (UniqueName: \"kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7\") pod \"02ebaec4-8e37-483f-8b86-1a793a44572b\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") "
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.463740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities\") pod \"02ebaec4-8e37-483f-8b86-1a793a44572b\" (UID: \"02ebaec4-8e37-483f-8b86-1a793a44572b\") "
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.465559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities" (OuterVolumeSpecName: "utilities") pod "02ebaec4-8e37-483f-8b86-1a793a44572b" (UID: "02ebaec4-8e37-483f-8b86-1a793a44572b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.481185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7" (OuterVolumeSpecName: "kube-api-access-wc4q7") pod "02ebaec4-8e37-483f-8b86-1a793a44572b" (UID: "02ebaec4-8e37-483f-8b86-1a793a44572b"). InnerVolumeSpecName "kube-api-access-wc4q7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.528123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02ebaec4-8e37-483f-8b86-1a793a44572b" (UID: "02ebaec4-8e37-483f-8b86-1a793a44572b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.566073 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.566106 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02ebaec4-8e37-483f-8b86-1a793a44572b-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.566120 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc4q7\" (UniqueName: \"kubernetes.io/projected/02ebaec4-8e37-483f-8b86-1a793a44572b-kube-api-access-wc4q7\") on node \"crc\" DevicePath \"\""
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.863214 4749 generic.go:334] "Generic (PLEG): container finished" podID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerID="d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f" exitCode=0
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.863277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerDied","Data":"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"}
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.863291 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnpvr"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.863311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnpvr" event={"ID":"02ebaec4-8e37-483f-8b86-1a793a44572b","Type":"ContainerDied","Data":"e8d9c375fd0129c14475495b07a2883651763c84304276fbbc8cd9253a0e1823"}
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.863340 4749 scope.go:117] "RemoveContainer" containerID="d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.884276 4749 scope.go:117] "RemoveContainer" containerID="034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.909618 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.916811 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnpvr"]
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.941390 4749 scope.go:117] "RemoveContainer" containerID="1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.972590 4749 scope.go:117] "RemoveContainer" containerID="d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"
Nov 29 02:41:04 crc kubenswrapper[4749]: E1129 02:41:04.975501 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f\": container with ID starting with d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f not found: ID does not exist" containerID="d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.975586 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f"} err="failed to get container status \"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f\": rpc error: code = NotFound desc = could not find container \"d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f\": container with ID starting with d7b67a0744c312de0461c8207541fb08d0099beb262587596934032b0db6f50f not found: ID does not exist"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.975640 4749 scope.go:117] "RemoveContainer" containerID="034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d"
Nov 29 02:41:04 crc kubenswrapper[4749]: E1129 02:41:04.976103 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d\": container with ID starting with 034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d not found: ID does not exist" containerID="034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.976140 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d"} err="failed to get container status \"034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d\": rpc error: code = NotFound desc = could not find container \"034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d\": container with ID starting with 034761c0885f75c1eb8fe240a8aaa9cf670f006cd6abe3939d6b373f07f9b07d not found: ID does not exist"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.976170 4749 scope.go:117] "RemoveContainer" containerID="1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e"
Nov 29 02:41:04 crc kubenswrapper[4749]: E1129 02:41:04.976414 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e\": container with ID starting with 1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e not found: ID does not exist" containerID="1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e"
Nov 29 02:41:04 crc kubenswrapper[4749]: I1129 02:41:04.976474 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e"} err="failed to get container status \"1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e\": rpc error: code = NotFound desc = could not find container \"1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e\": container with ID starting with 1471e11685f646802b62a5e35328fd0ad58eeafb5355d325b73f5cec5a2bd67e not found: ID does not exist"
Nov 29 02:41:05 crc kubenswrapper[4749]: I1129 02:41:05.092329 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" path="/var/lib/kubelet/pods/02ebaec4-8e37-483f-8b86-1a793a44572b/volumes"
Nov 29 02:41:14 crc kubenswrapper[4749]: I1129 02:41:14.120866 4749 scope.go:117] "RemoveContainer" containerID="f7671c09b5aea297d95ec8a96829f72d9a8ea2cf989f96bb0766b86cf608358a"
Nov 29 02:41:14 crc kubenswrapper[4749]: I1129 02:41:14.154666 4749 scope.go:117] "RemoveContainer" containerID="1f4fbabe45131a1b877202013712e9c24d10996a5ce276ee9d7ab0b91ef87d64"
Nov 29 02:41:14 crc kubenswrapper[4749]: I1129 02:41:14.221278 4749 scope.go:117] "RemoveContainer" containerID="d110251ce2f74fbcbb495fbe4f6e9ea91a32b6cea2d96ce51646fa2c8080ecdd"
Nov 29 02:41:24 crc kubenswrapper[4749]: I1129 02:41:24.569970 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-595f8844b9-gpbrs"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.482630 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Nov 29 02:41:28 crc kubenswrapper[4749]: E1129 02:41:28.483634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="extract-utilities"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.483651 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="extract-utilities"
Nov 29 02:41:28 crc kubenswrapper[4749]: E1129 02:41:28.483680 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="registry-server"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.483689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="registry-server"
Nov 29 02:41:28 crc kubenswrapper[4749]: E1129 02:41:28.483706 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="extract-content"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.483714 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="extract-content"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.483913 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ebaec4-8e37-483f-8b86-1a793a44572b" containerName="registry-server"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.484561 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.493330 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.493387 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.493401 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-pd26z"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.511589 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.603668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.603745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.603825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlnm8\" (UniqueName: \"kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.705095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.705186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.705301 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlnm8\" (UniqueName: \"kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.706531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.714400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.731013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlnm8\" (UniqueName: \"kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8\") pod \"openstackclient\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " pod="openstack/openstackclient"
Nov 29 02:41:28 crc kubenswrapper[4749]: I1129 02:41:28.809883 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 29 02:41:29 crc kubenswrapper[4749]: I1129 02:41:29.306624 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Nov 29 02:41:30 crc kubenswrapper[4749]: I1129 02:41:30.149394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6ebecb6a-b8de-43fa-b228-7008102bd827","Type":"ContainerStarted","Data":"85fa615830a4186f7b1e92991aba88cdc6a2c2bced5e044e32b8eb9fb8d184dc"}
Nov 29 02:41:30 crc kubenswrapper[4749]: I1129 02:41:30.150388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6ebecb6a-b8de-43fa-b228-7008102bd827","Type":"ContainerStarted","Data":"e91c1c684e8c2d4bbbd44ea69e4363acb9922b1678e76508f0991e1c149602e5"}
Nov 29 02:41:30 crc kubenswrapper[4749]: I1129 02:41:30.176796 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.176771349 podStartE2EDuration="2.176771349s" podCreationTimestamp="2025-11-29 02:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:41:30.170957777 +0000 UTC m=+5433.343107644" watchObservedRunningTime="2025-11-29 02:41:30.176771349 +0000 UTC m=+5433.348921266"
Nov 29 02:41:55 crc kubenswrapper[4749]: I1129 02:41:55.374123 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:41:55 crc kubenswrapper[4749]: I1129 02:41:55.374871 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:42:01 crc kubenswrapper[4749]: E1129 02:42:01.792306 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:41058->38.102.83.30:35737: write tcp 38.102.83.30:41058->38.102.83.30:35737: write: broken pipe
Nov 29 02:42:25 crc kubenswrapper[4749]: I1129 02:42:25.374389 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:42:25 crc kubenswrapper[4749]: I1129 02:42:25.375007 4749 prober.go:107] "Probe failed"
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:42:55 crc kubenswrapper[4749]: I1129 02:42:55.374134 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:42:55 crc kubenswrapper[4749]: I1129 02:42:55.374836 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:42:55 crc kubenswrapper[4749]: I1129 02:42:55.374892 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:42:55 crc kubenswrapper[4749]: I1129 02:42:55.375605 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:42:55 crc kubenswrapper[4749]: I1129 02:42:55.375702 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323" gracePeriod=600 Nov 29 02:42:56 crc kubenswrapper[4749]: I1129 02:42:56.061806 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323" exitCode=0 Nov 29 02:42:56 crc kubenswrapper[4749]: I1129 02:42:56.061903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323"} Nov 29 02:42:56 crc kubenswrapper[4749]: I1129 02:42:56.062170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"} Nov 29 02:42:56 crc kubenswrapper[4749]: I1129 02:42:56.062216 4749 scope.go:117] "RemoveContainer" containerID="21023d1be0e964969b800b65fcca007eaa2dcaf0059e1a742baa3249022b9e24" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.184672 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j7dsd"] Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.186587 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.192570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7dsd"] Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.290870 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4cdd-account-create-update-l6ljq"] Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.292806 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.294727 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.298241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4cdd-account-create-update-l6ljq"] Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.351153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.351397 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvzc\" (UniqueName: \"kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.453366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvzc\" (UniqueName: \"kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.453664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.453869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2lg\" (UniqueName: \"kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.453949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.455148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.489000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvzc\" (UniqueName: \"kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc\") pod \"barbican-db-create-j7dsd\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.501810 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.555092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.555190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2lg\" (UniqueName: \"kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.556085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.576883 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2lg\" (UniqueName: \"kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg\") pod \"barbican-4cdd-account-create-update-l6ljq\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.646514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.785685 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7dsd"] Nov 29 02:43:07 crc kubenswrapper[4749]: I1129 02:43:07.915914 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4cdd-account-create-update-l6ljq"] Nov 29 02:43:07 crc kubenswrapper[4749]: W1129 02:43:07.921524 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cecf171_c49e_4dd3_be48_a490e396333a.slice/crio-c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088 WatchSource:0}: Error finding container c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088: Status 404 returned error can't find the container with id c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088 Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.207630 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4cdd-account-create-update-l6ljq" event={"ID":"7cecf171-c49e-4dd3-be48-a490e396333a","Type":"ContainerStarted","Data":"4dbc06eb7a24c743115062c55a43b788829cbebc53f725d4cfd5fa4cea148a16"} Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.208023 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4cdd-account-create-update-l6ljq" event={"ID":"7cecf171-c49e-4dd3-be48-a490e396333a","Type":"ContainerStarted","Data":"c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088"} Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.209921 4749 generic.go:334] "Generic (PLEG): container finished" podID="28ccb818-8eda-414d-9692-ded50b74bd5e" containerID="440da4586c29b79db000ec02af6c4bd1a809ad83cd072266cb8db2c5162dac90" exitCode=0 Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.209977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7dsd" event={"ID":"28ccb818-8eda-414d-9692-ded50b74bd5e","Type":"ContainerDied","Data":"440da4586c29b79db000ec02af6c4bd1a809ad83cd072266cb8db2c5162dac90"} Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.210045 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7dsd" event={"ID":"28ccb818-8eda-414d-9692-ded50b74bd5e","Type":"ContainerStarted","Data":"c065c324f67c81cd543314927a38d6fc336f038ba5c1a4b1f1ea9dd06e6e1f27"} Nov 29 02:43:08 crc kubenswrapper[4749]: I1129 02:43:08.225464 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-4cdd-account-create-update-l6ljq" podStartSLOduration=1.225447018 podStartE2EDuration="1.225447018s" podCreationTimestamp="2025-11-29 02:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:08.221756048 +0000 UTC m=+5531.393905915" watchObservedRunningTime="2025-11-29 02:43:08.225447018 +0000 UTC m=+5531.397596865" Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.242049 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cecf171-c49e-4dd3-be48-a490e396333a" containerID="4dbc06eb7a24c743115062c55a43b788829cbebc53f725d4cfd5fa4cea148a16" exitCode=0 Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.242121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4cdd-account-create-update-l6ljq" 
event={"ID":"7cecf171-c49e-4dd3-be48-a490e396333a","Type":"ContainerDied","Data":"4dbc06eb7a24c743115062c55a43b788829cbebc53f725d4cfd5fa4cea148a16"} Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.629125 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.809457 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts\") pod \"28ccb818-8eda-414d-9692-ded50b74bd5e\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.809632 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckvzc\" (UniqueName: \"kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc\") pod \"28ccb818-8eda-414d-9692-ded50b74bd5e\" (UID: \"28ccb818-8eda-414d-9692-ded50b74bd5e\") " Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.809990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28ccb818-8eda-414d-9692-ded50b74bd5e" (UID: "28ccb818-8eda-414d-9692-ded50b74bd5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.810223 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28ccb818-8eda-414d-9692-ded50b74bd5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.819409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc" (OuterVolumeSpecName: "kube-api-access-ckvzc") pod "28ccb818-8eda-414d-9692-ded50b74bd5e" (UID: "28ccb818-8eda-414d-9692-ded50b74bd5e"). InnerVolumeSpecName "kube-api-access-ckvzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:09 crc kubenswrapper[4749]: I1129 02:43:09.911531 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckvzc\" (UniqueName: \"kubernetes.io/projected/28ccb818-8eda-414d-9692-ded50b74bd5e-kube-api-access-ckvzc\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.258313 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7dsd" event={"ID":"28ccb818-8eda-414d-9692-ded50b74bd5e","Type":"ContainerDied","Data":"c065c324f67c81cd543314927a38d6fc336f038ba5c1a4b1f1ea9dd06e6e1f27"} Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.260514 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c065c324f67c81cd543314927a38d6fc336f038ba5c1a4b1f1ea9dd06e6e1f27" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.258396 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j7dsd" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.636223 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.726254 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk2lg\" (UniqueName: \"kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg\") pod \"7cecf171-c49e-4dd3-be48-a490e396333a\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.726659 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts\") pod \"7cecf171-c49e-4dd3-be48-a490e396333a\" (UID: \"7cecf171-c49e-4dd3-be48-a490e396333a\") " Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.727591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7cecf171-c49e-4dd3-be48-a490e396333a" (UID: "7cecf171-c49e-4dd3-be48-a490e396333a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.733445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg" (OuterVolumeSpecName: "kube-api-access-vk2lg") pod "7cecf171-c49e-4dd3-be48-a490e396333a" (UID: "7cecf171-c49e-4dd3-be48-a490e396333a"). InnerVolumeSpecName "kube-api-access-vk2lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.828767 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk2lg\" (UniqueName: \"kubernetes.io/projected/7cecf171-c49e-4dd3-be48-a490e396333a-kube-api-access-vk2lg\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:10 crc kubenswrapper[4749]: I1129 02:43:10.828803 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cecf171-c49e-4dd3-be48-a490e396333a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:11 crc kubenswrapper[4749]: I1129 02:43:11.272057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4cdd-account-create-update-l6ljq" event={"ID":"7cecf171-c49e-4dd3-be48-a490e396333a","Type":"ContainerDied","Data":"c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088"} Nov 29 02:43:11 crc kubenswrapper[4749]: I1129 02:43:11.272093 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c89cba6ff7cc6b20906c787a17e1f8191c6d67eb189245261930d7ffed607088" Nov 29 02:43:11 crc kubenswrapper[4749]: I1129 02:43:11.272172 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-4cdd-account-create-update-l6ljq" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.571888 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mjrvg"] Nov 29 02:43:12 crc kubenswrapper[4749]: E1129 02:43:12.572520 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ccb818-8eda-414d-9692-ded50b74bd5e" containerName="mariadb-database-create" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.572533 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ccb818-8eda-414d-9692-ded50b74bd5e" containerName="mariadb-database-create" Nov 29 02:43:12 crc kubenswrapper[4749]: E1129 02:43:12.572570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cecf171-c49e-4dd3-be48-a490e396333a" containerName="mariadb-account-create-update" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.572577 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cecf171-c49e-4dd3-be48-a490e396333a" containerName="mariadb-account-create-update" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.572717 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ccb818-8eda-414d-9692-ded50b74bd5e" containerName="mariadb-database-create" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.572727 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cecf171-c49e-4dd3-be48-a490e396333a" containerName="mariadb-account-create-update" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.573334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.575215 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vjzw" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.575508 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.592146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mjrvg"] Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.660423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6zr\" (UniqueName: \"kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.660692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.660790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.761703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.761751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.761850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th6zr\" (UniqueName: \"kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.767032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.769294 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.787245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6zr\" (UniqueName: \"kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr\") pod \"barbican-db-sync-mjrvg\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:12 crc kubenswrapper[4749]: I1129 02:43:12.893439 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:13 crc kubenswrapper[4749]: W1129 02:43:13.357652 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62c8fbfd_ceea_44ee_9b20_b298bd7be81a.slice/crio-b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102 WatchSource:0}: Error finding container b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102: Status 404 returned error can't find the container with id b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102 Nov 29 02:43:13 crc kubenswrapper[4749]: I1129 02:43:13.358375 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mjrvg"] Nov 29 02:43:14 crc kubenswrapper[4749]: I1129 02:43:14.307160 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mjrvg" event={"ID":"62c8fbfd-ceea-44ee-9b20-b298bd7be81a","Type":"ContainerStarted","Data":"fb1f666098c5292e037c4dd5ffd719561c294f91fdb39f539dc5bfbe2b663775"} Nov 29 02:43:14 crc kubenswrapper[4749]: I1129 02:43:14.308501 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mjrvg" event={"ID":"62c8fbfd-ceea-44ee-9b20-b298bd7be81a","Type":"ContainerStarted","Data":"b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102"} Nov 29 02:43:14 crc kubenswrapper[4749]: I1129 02:43:14.324790 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mjrvg" podStartSLOduration=2.324772853 podStartE2EDuration="2.324772853s" podCreationTimestamp="2025-11-29 02:43:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:14.323545073 +0000 UTC m=+5537.495694940" watchObservedRunningTime="2025-11-29 02:43:14.324772853 +0000 UTC m=+5537.496922710" Nov 29 02:43:15 crc kubenswrapper[4749]: I1129 02:43:15.322824 4749 generic.go:334] "Generic (PLEG): container finished" podID="62c8fbfd-ceea-44ee-9b20-b298bd7be81a" containerID="fb1f666098c5292e037c4dd5ffd719561c294f91fdb39f539dc5bfbe2b663775" exitCode=0 Nov 29 02:43:15 crc kubenswrapper[4749]: I1129 02:43:15.322886 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mjrvg" event={"ID":"62c8fbfd-ceea-44ee-9b20-b298bd7be81a","Type":"ContainerDied","Data":"fb1f666098c5292e037c4dd5ffd719561c294f91fdb39f539dc5bfbe2b663775"} Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.696378 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.835178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th6zr\" (UniqueName: \"kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr\") pod \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.835318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data\") pod \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.835494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle\") pod \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\" (UID: \"62c8fbfd-ceea-44ee-9b20-b298bd7be81a\") " Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.842159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr" (OuterVolumeSpecName: "kube-api-access-th6zr") pod "62c8fbfd-ceea-44ee-9b20-b298bd7be81a" (UID: "62c8fbfd-ceea-44ee-9b20-b298bd7be81a"). InnerVolumeSpecName "kube-api-access-th6zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.845328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "62c8fbfd-ceea-44ee-9b20-b298bd7be81a" (UID: "62c8fbfd-ceea-44ee-9b20-b298bd7be81a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.869744 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62c8fbfd-ceea-44ee-9b20-b298bd7be81a" (UID: "62c8fbfd-ceea-44ee-9b20-b298bd7be81a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.940349 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.940679 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:16 crc kubenswrapper[4749]: I1129 02:43:16.940830 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th6zr\" (UniqueName: \"kubernetes.io/projected/62c8fbfd-ceea-44ee-9b20-b298bd7be81a-kube-api-access-th6zr\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.346412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mjrvg" event={"ID":"62c8fbfd-ceea-44ee-9b20-b298bd7be81a","Type":"ContainerDied","Data":"b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102"} Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.346472 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ea0f88f1cf6755d3bb435c23bdc1bd87a5c01ba663fef929376eb6fe2a5102" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.346892 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mjrvg" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.641107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-694fccbdcd-wbj66"] Nov 29 02:43:17 crc kubenswrapper[4749]: E1129 02:43:17.641592 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c8fbfd-ceea-44ee-9b20-b298bd7be81a" containerName="barbican-db-sync" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.641613 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c8fbfd-ceea-44ee-9b20-b298bd7be81a" containerName="barbican-db-sync" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.641821 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c8fbfd-ceea-44ee-9b20-b298bd7be81a" containerName="barbican-db-sync" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.642933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.646691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.646959 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5vjzw" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.653406 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.659601 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5f6ddb5fdc-v9j2j"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.660958 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.663806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.673954 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f6ddb5fdc-v9j2j"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.685274 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-694fccbdcd-wbj66"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.748631 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.750777 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757390 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-logs\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-combined-ca-bundle\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data-custom\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d9q\" (UniqueName: \"kubernetes.io/projected/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-kube-api-access-h2d9q\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c386cbf4-8348-49c7-b1d0-35e519fe20e6-logs\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757829 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glsln\" (UniqueName: \"kubernetes.io/projected/c386cbf4-8348-49c7-b1d0-35e519fe20e6-kube-api-access-glsln\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data-custom\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.757897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-combined-ca-bundle\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.763010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.808508 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55dbb95c78-qbs5z"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.809882 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.812394 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.830955 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dbb95c78-qbs5z"] Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.859677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.859827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-combined-ca-bundle\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.859944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-logs\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-combined-ca-bundle\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data-custom\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d9q\" (UniqueName: \"kubernetes.io/projected/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-kube-api-access-h2d9q\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc\") pod 
\"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c386cbf4-8348-49c7-b1d0-35e519fe20e6-logs\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257ns\" (UniqueName: \"kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860792 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glsln\" (UniqueName: \"kubernetes.io/projected/c386cbf4-8348-49c7-b1d0-35e519fe20e6-kube-api-access-glsln\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data-custom\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.861554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c386cbf4-8348-49c7-b1d0-35e519fe20e6-logs\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.860564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-logs\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.864960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data-custom\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.865681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-combined-ca-bundle\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.865818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-combined-ca-bundle\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.866418 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c386cbf4-8348-49c7-b1d0-35e519fe20e6-config-data\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.867615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.869790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-config-data-custom\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.878642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glsln\" (UniqueName: \"kubernetes.io/projected/c386cbf4-8348-49c7-b1d0-35e519fe20e6-kube-api-access-glsln\") pod \"barbican-keystone-listener-694fccbdcd-wbj66\" (UID: \"c386cbf4-8348-49c7-b1d0-35e519fe20e6\") " pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.884149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d9q\" (UniqueName: \"kubernetes.io/projected/2c5e7bf3-d98e-4d90-8fad-c71017fa20c4-kube-api-access-h2d9q\") pod \"barbican-worker-5f6ddb5fdc-v9j2j\" (UID: \"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4\") " pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962704 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czx8c\" (UniqueName: \"kubernetes.io/projected/34b6e3bf-224f-4796-844b-7b720cd27e67-kube-api-access-czx8c\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962808 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b6e3bf-224f-4796-844b-7b720cd27e67-logs\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data-custom\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962948 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.962999 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257ns\" (UniqueName: \"kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.963029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.963089 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-combined-ca-bundle\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.963782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.964023 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.964230 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.964236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.969262 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.981502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257ns\" (UniqueName: \"kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns\") pod \"dnsmasq-dns-9d9f4c8cf-jrgc6\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:17 crc kubenswrapper[4749]: I1129 02:43:17.985544 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.064057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czx8c\" (UniqueName: \"kubernetes.io/projected/34b6e3bf-224f-4796-844b-7b720cd27e67-kube-api-access-czx8c\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.064426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b6e3bf-224f-4796-844b-7b720cd27e67-logs\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.064457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data-custom\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.064522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.064567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-combined-ca-bundle\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.065769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34b6e3bf-224f-4796-844b-7b720cd27e67-logs\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.067152 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.085551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data-custom\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.085798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-combined-ca-bundle\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.089538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czx8c\" (UniqueName: \"kubernetes.io/projected/34b6e3bf-224f-4796-844b-7b720cd27e67-kube-api-access-czx8c\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.090746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34b6e3bf-224f-4796-844b-7b720cd27e67-config-data\") pod \"barbican-api-55dbb95c78-qbs5z\" (UID: \"34b6e3bf-224f-4796-844b-7b720cd27e67\") " pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.131087 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.450946 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-694fccbdcd-wbj66"] Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.453754 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5f6ddb5fdc-v9j2j"] Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.611334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55dbb95c78-qbs5z"] Nov 29 02:43:18 crc kubenswrapper[4749]: W1129 02:43:18.614402 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b6e3bf_224f_4796_844b_7b720cd27e67.slice/crio-c66f233bbf1d4f9f8c75b23aa3ad391b825c8b6bd2132fe99fdd27cf69451498 WatchSource:0}: Error finding container c66f233bbf1d4f9f8c75b23aa3ad391b825c8b6bd2132fe99fdd27cf69451498: Status 404 returned error can't find the container with id c66f233bbf1d4f9f8c75b23aa3ad391b825c8b6bd2132fe99fdd27cf69451498 Nov 29 02:43:18 crc kubenswrapper[4749]: I1129 02:43:18.639345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:43:18 crc kubenswrapper[4749]: W1129 02:43:18.643454 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0e717dd_6190_4504_8d53_fd190b2112ab.slice/crio-4a7a13c34a8b808f79513390209d2fda5b51f12bb9916c0cbd2ecf02128e440e WatchSource:0}: Error finding container 4a7a13c34a8b808f79513390209d2fda5b51f12bb9916c0cbd2ecf02128e440e: Status 404 returned error can't find the container with id 
4a7a13c34a8b808f79513390209d2fda5b51f12bb9916c0cbd2ecf02128e440e Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.378715 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerID="b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17" exitCode=0 Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.378758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" event={"ID":"a0e717dd-6190-4504-8d53-fd190b2112ab","Type":"ContainerDied","Data":"b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.378985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" event={"ID":"a0e717dd-6190-4504-8d53-fd190b2112ab","Type":"ContainerStarted","Data":"4a7a13c34a8b808f79513390209d2fda5b51f12bb9916c0cbd2ecf02128e440e"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.388653 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" event={"ID":"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4","Type":"ContainerStarted","Data":"40e119b4b0d985121630703ad5d71f2d9a2d43e9d3076f7a330a4dfa27105efc"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.388694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" event={"ID":"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4","Type":"ContainerStarted","Data":"d37c1c9ef215085cb04c5504db1f0ca32089093494a00c5843446f4a0ca9a7c5"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.388704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" event={"ID":"2c5e7bf3-d98e-4d90-8fad-c71017fa20c4","Type":"ContainerStarted","Data":"5bf94869c09b32558f300e937df2d1013f18404892daef1406f9dfc1039eb305"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.403445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dbb95c78-qbs5z" event={"ID":"34b6e3bf-224f-4796-844b-7b720cd27e67","Type":"ContainerStarted","Data":"1c52e9c3a2ac8911fc49656aa65dd98abf4857dab37b07a68ab525baf3f69ddb"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.403488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dbb95c78-qbs5z" event={"ID":"34b6e3bf-224f-4796-844b-7b720cd27e67","Type":"ContainerStarted","Data":"72026d1b7316ea9d0ac96d76e06f7d8994aa0b5c677a7423a63306d91d0bcfe9"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.403498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55dbb95c78-qbs5z" event={"ID":"34b6e3bf-224f-4796-844b-7b720cd27e67","Type":"ContainerStarted","Data":"c66f233bbf1d4f9f8c75b23aa3ad391b825c8b6bd2132fe99fdd27cf69451498"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.403668 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.403701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.407580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" event={"ID":"c386cbf4-8348-49c7-b1d0-35e519fe20e6","Type":"ContainerStarted","Data":"c765b82829b2657dd1c87f8d4e106161cc576efa6b310b2a78889f7e8d20c97a"} Nov 29 02:43:19 
crc kubenswrapper[4749]: I1129 02:43:19.407612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" event={"ID":"c386cbf4-8348-49c7-b1d0-35e519fe20e6","Type":"ContainerStarted","Data":"7c32599118a9c2ebdfbf4ff9e09a6c31cc75823d080e4f723bff073f08249bac"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.407625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" event={"ID":"c386cbf4-8348-49c7-b1d0-35e519fe20e6","Type":"ContainerStarted","Data":"d57a3dfbcb0cbf16dfcbcc7044efff624aa5787db5b623e0f55df5b04b3462bb"} Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.453209 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5f6ddb5fdc-v9j2j" podStartSLOduration=2.45318036 podStartE2EDuration="2.45318036s" podCreationTimestamp="2025-11-29 02:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:19.435811428 +0000 UTC m=+5542.607961285" watchObservedRunningTime="2025-11-29 02:43:19.45318036 +0000 UTC m=+5542.625330217" Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.466712 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-694fccbdcd-wbj66" podStartSLOduration=2.4666957480000002 podStartE2EDuration="2.466695748s" podCreationTimestamp="2025-11-29 02:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:19.466631507 +0000 UTC m=+5542.638781374" watchObservedRunningTime="2025-11-29 02:43:19.466695748 +0000 UTC m=+5542.638845605" Nov 29 02:43:19 crc kubenswrapper[4749]: I1129 02:43:19.500415 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55dbb95c78-qbs5z" podStartSLOduration=2.500397417 podStartE2EDuration="2.500397417s" podCreationTimestamp="2025-11-29 02:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:19.492073225 +0000 UTC m=+5542.664223082" watchObservedRunningTime="2025-11-29 02:43:19.500397417 +0000 UTC m=+5542.672547274" Nov 29 02:43:20 crc kubenswrapper[4749]: I1129 02:43:20.420784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" event={"ID":"a0e717dd-6190-4504-8d53-fd190b2112ab","Type":"ContainerStarted","Data":"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488"} Nov 29 02:43:20 crc kubenswrapper[4749]: I1129 02:43:20.455590 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" podStartSLOduration=3.455564123 podStartE2EDuration="3.455564123s" podCreationTimestamp="2025-11-29 02:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:20.44557139 +0000 UTC m=+5543.617721287" watchObservedRunningTime="2025-11-29 02:43:20.455564123 +0000 UTC m=+5543.627714020" Nov 29 02:43:21 crc kubenswrapper[4749]: I1129 02:43:21.430041 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:24 crc kubenswrapper[4749]: I1129 02:43:24.496762 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:25 crc kubenswrapper[4749]: I1129 02:43:25.947331 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55dbb95c78-qbs5z" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.069313 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.126415 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.126654 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="dnsmasq-dns" containerID="cri-o://bad4e54e311d51a434c7888485ddf4daa06a067b8c79f44fcaaf65859bc7c861" gracePeriod=10 Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.502916 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e61d941-9458-4dd0-9468-f977b67c922c" containerID="bad4e54e311d51a434c7888485ddf4daa06a067b8c79f44fcaaf65859bc7c861" exitCode=0 Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.503256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" event={"ID":"5e61d941-9458-4dd0-9468-f977b67c922c","Type":"ContainerDied","Data":"bad4e54e311d51a434c7888485ddf4daa06a067b8c79f44fcaaf65859bc7c861"} Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.621661 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.693981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb\") pod \"5e61d941-9458-4dd0-9468-f977b67c922c\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.694027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb\") pod \"5e61d941-9458-4dd0-9468-f977b67c922c\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.694048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc\") pod \"5e61d941-9458-4dd0-9468-f977b67c922c\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.694065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config\") pod \"5e61d941-9458-4dd0-9468-f977b67c922c\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.694111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rw2d\" (UniqueName: \"kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d\") pod \"5e61d941-9458-4dd0-9468-f977b67c922c\" (UID: \"5e61d941-9458-4dd0-9468-f977b67c922c\") " Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.714432 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d" (OuterVolumeSpecName: "kube-api-access-8rw2d") pod "5e61d941-9458-4dd0-9468-f977b67c922c" (UID: "5e61d941-9458-4dd0-9468-f977b67c922c"). InnerVolumeSpecName "kube-api-access-8rw2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.738100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config" (OuterVolumeSpecName: "config") pod "5e61d941-9458-4dd0-9468-f977b67c922c" (UID: "5e61d941-9458-4dd0-9468-f977b67c922c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.746051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e61d941-9458-4dd0-9468-f977b67c922c" (UID: "5e61d941-9458-4dd0-9468-f977b67c922c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.746227 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e61d941-9458-4dd0-9468-f977b67c922c" (UID: "5e61d941-9458-4dd0-9468-f977b67c922c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.763287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e61d941-9458-4dd0-9468-f977b67c922c" (UID: "5e61d941-9458-4dd0-9468-f977b67c922c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.796039 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.796066 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.796076 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.796084 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e61d941-9458-4dd0-9468-f977b67c922c-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:28 crc kubenswrapper[4749]: I1129 02:43:28.796095 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rw2d\" (UniqueName: \"kubernetes.io/projected/5e61d941-9458-4dd0-9468-f977b67c922c-kube-api-access-8rw2d\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.515668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" event={"ID":"5e61d941-9458-4dd0-9468-f977b67c922c","Type":"ContainerDied","Data":"2dd70d5a3250979e0a56098bd239e2ac955b5221139da16cb4afa3256fe96e32"} Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.515977 4749 scope.go:117] "RemoveContainer" containerID="bad4e54e311d51a434c7888485ddf4daa06a067b8c79f44fcaaf65859bc7c861" Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.516100 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869b8c7567-wnls6" Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.539608 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.544078 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869b8c7567-wnls6"] Nov 29 02:43:29 crc kubenswrapper[4749]: I1129 02:43:29.545380 4749 scope.go:117] "RemoveContainer" containerID="ff9b182d95f581343ae1b70666dca164862b676c7735185c82be771074d2c164" Nov 29 02:43:31 crc kubenswrapper[4749]: I1129 02:43:31.089849 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" path="/var/lib/kubelet/pods/5e61d941-9458-4dd0-9468-f977b67c922c/volumes" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.285234 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qqwz7"] Nov 29 02:43:39 crc kubenswrapper[4749]: E1129 02:43:39.286141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="init" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.286155 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="init" Nov 29 02:43:39 crc kubenswrapper[4749]: E1129 02:43:39.286176 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="dnsmasq-dns" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.286181 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="dnsmasq-dns" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.286360 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e61d941-9458-4dd0-9468-f977b67c922c" containerName="dnsmasq-dns" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.287041 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.301033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqwz7"] Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.383079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.383157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lprhh\" (UniqueName: \"kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.386624 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b359-account-create-update-zpb5q"] Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.387760 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.389865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.394723 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b359-account-create-update-zpb5q"] Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.484893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lprhh\" (UniqueName: \"kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.484931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpwr\" (UniqueName: \"kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr\") pod \"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.484978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts\") pod \"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.485054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.485648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.507674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lprhh\" (UniqueName: \"kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh\") pod \"neutron-db-create-qqwz7\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.586404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpwr\" (UniqueName: \"kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr\") pod \"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.586479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts\") pod 
\"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.587325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts\") pod \"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.605317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.612421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpwr\" (UniqueName: \"kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr\") pod \"neutron-b359-account-create-update-zpb5q\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:39 crc kubenswrapper[4749]: I1129 02:43:39.712580 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.073868 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqwz7"] Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.223887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b359-account-create-update-zpb5q"] Nov 29 02:43:40 crc kubenswrapper[4749]: W1129 02:43:40.229942 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f5007a_c36c_49c8_9be8_cb3ffe01bb03.slice/crio-7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7 WatchSource:0}: Error finding container 7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7: Status 404 returned error can't find the container with id 7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7 Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.627040 4749 generic.go:334] "Generic (PLEG): container finished" podID="a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" containerID="a4631435c9aeaacf2ca5b3f8179670453fd4edc6a8f1d7e9d4d36ef8d8808c7b" exitCode=0 Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.627158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b359-account-create-update-zpb5q" event={"ID":"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03","Type":"ContainerDied","Data":"a4631435c9aeaacf2ca5b3f8179670453fd4edc6a8f1d7e9d4d36ef8d8808c7b"} Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.627208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b359-account-create-update-zpb5q" event={"ID":"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03","Type":"ContainerStarted","Data":"7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7"} Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.628885 4749 generic.go:334] "Generic (PLEG): container finished" podID="17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" containerID="a2021300eb0e5267251914a7b74251631fdb371ca3421dfa0860ed477adeacc6" exitCode=0 Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.628925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqwz7" 
event={"ID":"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed","Type":"ContainerDied","Data":"a2021300eb0e5267251914a7b74251631fdb371ca3421dfa0860ed477adeacc6"} Nov 29 02:43:40 crc kubenswrapper[4749]: I1129 02:43:40.628990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqwz7" event={"ID":"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed","Type":"ContainerStarted","Data":"8971e366442f91735cd34aa363ff05b78ed18ba2d3283b18eacc3cfc5b223b12"} Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.488759 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.493822 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.559709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lprhh\" (UniqueName: \"kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh\") pod \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.560031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts\") pod \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\" (UID: \"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed\") " Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.560133 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts\") pod \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.560350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" (UID: "17fd7c23-34d8-447d-86e4-f3cb7a49f7ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.560553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdpwr\" (UniqueName: \"kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr\") pod \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\" (UID: \"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03\") " Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.560707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" (UID: "a6f5007a-c36c-49c8-9be8-cb3ffe01bb03"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.561126 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.561289 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.566051 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh" (OuterVolumeSpecName: "kube-api-access-lprhh") pod "17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" (UID: "17fd7c23-34d8-447d-86e4-f3cb7a49f7ed"). InnerVolumeSpecName "kube-api-access-lprhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.566061 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr" (OuterVolumeSpecName: "kube-api-access-wdpwr") pod "a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" (UID: "a6f5007a-c36c-49c8-9be8-cb3ffe01bb03"). InnerVolumeSpecName "kube-api-access-wdpwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.651829 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b359-account-create-update-zpb5q" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.651825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b359-account-create-update-zpb5q" event={"ID":"a6f5007a-c36c-49c8-9be8-cb3ffe01bb03","Type":"ContainerDied","Data":"7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7"} Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.651914 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7125a2d51f0128c95565b19b0f3e6e32162ca764894d38d6787aad16ef3c42b7" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.655013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqwz7" event={"ID":"17fd7c23-34d8-447d-86e4-f3cb7a49f7ed","Type":"ContainerDied","Data":"8971e366442f91735cd34aa363ff05b78ed18ba2d3283b18eacc3cfc5b223b12"} Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.655041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qqwz7" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.655044 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8971e366442f91735cd34aa363ff05b78ed18ba2d3283b18eacc3cfc5b223b12" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.663593 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdpwr\" (UniqueName: \"kubernetes.io/projected/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03-kube-api-access-wdpwr\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:42 crc kubenswrapper[4749]: I1129 02:43:42.663641 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lprhh\" (UniqueName: \"kubernetes.io/projected/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed-kube-api-access-lprhh\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.571592 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-js449"] Nov 29 02:43:44 crc kubenswrapper[4749]: E1129 02:43:44.572242 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" containerName="mariadb-database-create" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.572257 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" containerName="mariadb-database-create" Nov 29 02:43:44 crc kubenswrapper[4749]: E1129 02:43:44.572274 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" containerName="mariadb-account-create-update" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.572282 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" containerName="mariadb-account-create-update" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.572482 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" containerName="mariadb-database-create" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.572509 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" containerName="mariadb-account-create-update" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.573153 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.580152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jx9c8" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.580319 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.580610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.585478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-js449"] Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.700540 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqr2l\" (UniqueName: \"kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.700888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.701134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.803093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.803166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqr2l\" (UniqueName: \"kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.803248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.814901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.820444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.830486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqr2l\" (UniqueName: \"kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l\") pod \"neutron-db-sync-js449\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " pod="openstack/neutron-db-sync-js449" Nov 29 02:43:44 crc kubenswrapper[4749]: I1129 02:43:44.896624 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js449" Nov 29 02:43:45 crc kubenswrapper[4749]: I1129 02:43:45.401970 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-js449"] Nov 29 02:43:45 crc kubenswrapper[4749]: W1129 02:43:45.405217 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98455d99_8cf1_4065_a59b_28454051102b.slice/crio-0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35 WatchSource:0}: Error finding container 0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35: Status 404 returned error can't find the container with id 0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35 Nov 29 02:43:45 crc kubenswrapper[4749]: I1129 02:43:45.704020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js449" event={"ID":"98455d99-8cf1-4065-a59b-28454051102b","Type":"ContainerStarted","Data":"b63741453603aecbda87cee7cfd9bcf6743ef5a280be203156ebb303cba77243"} Nov 29 02:43:45 crc kubenswrapper[4749]: I1129 02:43:45.704259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js449" event={"ID":"98455d99-8cf1-4065-a59b-28454051102b","Type":"ContainerStarted","Data":"0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35"} Nov 29 02:43:45 crc kubenswrapper[4749]: I1129 02:43:45.736367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-js449" podStartSLOduration=1.736345688 podStartE2EDuration="1.736345688s" podCreationTimestamp="2025-11-29 02:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:45.727600715 +0000 UTC m=+5568.899750642" watchObservedRunningTime="2025-11-29 02:43:45.736345688 +0000 UTC m=+5568.908495555" Nov 29 02:43:49 crc kubenswrapper[4749]: I1129 02:43:49.749337 4749 generic.go:334] "Generic (PLEG): container finished" podID="98455d99-8cf1-4065-a59b-28454051102b" containerID="b63741453603aecbda87cee7cfd9bcf6743ef5a280be203156ebb303cba77243" exitCode=0 Nov 29 02:43:49 crc kubenswrapper[4749]: I1129 02:43:49.749482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js449" event={"ID":"98455d99-8cf1-4065-a59b-28454051102b","Type":"ContainerDied","Data":"b63741453603aecbda87cee7cfd9bcf6743ef5a280be203156ebb303cba77243"} Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.069498 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-js449" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.245876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqr2l\" (UniqueName: \"kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l\") pod \"98455d99-8cf1-4065-a59b-28454051102b\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.246003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle\") pod \"98455d99-8cf1-4065-a59b-28454051102b\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.246141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config\") pod \"98455d99-8cf1-4065-a59b-28454051102b\" (UID: \"98455d99-8cf1-4065-a59b-28454051102b\") " Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.258187 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l" (OuterVolumeSpecName: "kube-api-access-mqr2l") pod "98455d99-8cf1-4065-a59b-28454051102b" (UID: "98455d99-8cf1-4065-a59b-28454051102b"). InnerVolumeSpecName "kube-api-access-mqr2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.293054 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config" (OuterVolumeSpecName: "config") pod "98455d99-8cf1-4065-a59b-28454051102b" (UID: "98455d99-8cf1-4065-a59b-28454051102b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.295303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98455d99-8cf1-4065-a59b-28454051102b" (UID: "98455d99-8cf1-4065-a59b-28454051102b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.349747 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.349808 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqr2l\" (UniqueName: \"kubernetes.io/projected/98455d99-8cf1-4065-a59b-28454051102b-kube-api-access-mqr2l\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.349834 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98455d99-8cf1-4065-a59b-28454051102b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.765108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-js449" event={"ID":"98455d99-8cf1-4065-a59b-28454051102b","Type":"ContainerDied","Data":"0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35"} Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.765166 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ad2f951a9eda6e50e8fc65ec77706bbe2daacce607dab46d4d44d929bc97d35" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.765165 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-js449" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.938846 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:43:51 crc kubenswrapper[4749]: E1129 02:43:51.939276 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98455d99-8cf1-4065-a59b-28454051102b" containerName="neutron-db-sync" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.939294 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="98455d99-8cf1-4065-a59b-28454051102b" containerName="neutron-db-sync" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.939552 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="98455d99-8cf1-4065-a59b-28454051102b" containerName="neutron-db-sync" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.940813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:51 crc kubenswrapper[4749]: I1129 02:43:51.964496 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.059754 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-766b69ccd5-fhfd7"] Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.062948 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.064911 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.065017 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jx9c8" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.065041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.070561 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766b69ccd5-fhfd7"] Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.089981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090038 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmkt\" (UniqueName: \"kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090145 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw7bf\" (UniqueName: \"kubernetes.io/projected/cf9ee58f-5bd3-42b4-9004-699db5c01c70-kube-api-access-xw7bf\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090173 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-combined-ca-bundle\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-httpd-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 
02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.090439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-httpd-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191099 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191187 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmkt\" (UniqueName: \"kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191246 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw7bf\" (UniqueName: \"kubernetes.io/projected/cf9ee58f-5bd3-42b4-9004-699db5c01c70-kube-api-access-xw7bf\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191283 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.191299 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-combined-ca-bundle\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.192050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.192237 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.192326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.192405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc\") pod \"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.195336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-combined-ca-bundle\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.196871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.200430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cf9ee58f-5bd3-42b4-9004-699db5c01c70-httpd-config\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.210874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlmkt\" (UniqueName: \"kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt\") pod 
\"dnsmasq-dns-55fb47545c-tw6kq\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.210929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw7bf\" (UniqueName: \"kubernetes.io/projected/cf9ee58f-5bd3-42b4-9004-699db5c01c70-kube-api-access-xw7bf\") pod \"neutron-766b69ccd5-fhfd7\" (UID: \"cf9ee58f-5bd3-42b4-9004-699db5c01c70\") " pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.262968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.378613 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.770701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:43:52 crc kubenswrapper[4749]: W1129 02:43:52.782646 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0c5eae2_92c4_4eb1_86e6_ea728e456f00.slice/crio-c7ed947c8e09058ff2399f3edebdf226961e46ff8a3228c00a973f61d822c02c WatchSource:0}: Error finding container c7ed947c8e09058ff2399f3edebdf226961e46ff8a3228c00a973f61d822c02c: Status 404 returned error can't find the container with id c7ed947c8e09058ff2399f3edebdf226961e46ff8a3228c00a973f61d822c02c Nov 29 02:43:52 crc kubenswrapper[4749]: I1129 02:43:52.968365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-766b69ccd5-fhfd7"] Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.796259 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b69ccd5-fhfd7" event={"ID":"cf9ee58f-5bd3-42b4-9004-699db5c01c70","Type":"ContainerStarted","Data":"28eb51da742ea6ac94cb12a1b2bdd9a668c50cadb5f96c1142f32aeb45f0baf6"} Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.796531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b69ccd5-fhfd7" event={"ID":"cf9ee58f-5bd3-42b4-9004-699db5c01c70","Type":"ContainerStarted","Data":"3d9e213e7a0062b62061027d3849656cb60b93707c95b2f1053686cb32c48c62"} Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.796541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-766b69ccd5-fhfd7" event={"ID":"cf9ee58f-5bd3-42b4-9004-699db5c01c70","Type":"ContainerStarted","Data":"a94bcfe216bcbfa72d2103c9cb682946a62280038adb18851eca711ce4871bd6"} Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.796784 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.797875 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerID="7190fdd43a19bf7e1da031ac07fe4e2a9e41a01fa38d2a073d6dd37c7bc43e70" exitCode=0 Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.797909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" event={"ID":"d0c5eae2-92c4-4eb1-86e6-ea728e456f00","Type":"ContainerDied","Data":"7190fdd43a19bf7e1da031ac07fe4e2a9e41a01fa38d2a073d6dd37c7bc43e70"} Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.797944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" event={"ID":"d0c5eae2-92c4-4eb1-86e6-ea728e456f00","Type":"ContainerStarted","Data":"c7ed947c8e09058ff2399f3edebdf226961e46ff8a3228c00a973f61d822c02c"} Nov 29 02:43:53 crc kubenswrapper[4749]: I1129 02:43:53.821413 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-766b69ccd5-fhfd7" podStartSLOduration=1.821396215 podStartE2EDuration="1.821396215s" podCreationTimestamp="2025-11-29 02:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:53.819005387 +0000 UTC m=+5576.991155244" watchObservedRunningTime="2025-11-29 02:43:53.821396215 +0000 UTC m=+5576.993546072" Nov 29 02:43:54 crc kubenswrapper[4749]: I1129 02:43:54.814093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" event={"ID":"d0c5eae2-92c4-4eb1-86e6-ea728e456f00","Type":"ContainerStarted","Data":"78fdae179267d59b9cc2ecc695311fb4612db89834dbb136a24abf186878bd55"} Nov 29 02:43:54 crc kubenswrapper[4749]: I1129 02:43:54.814866 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.264569 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.285402 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" podStartSLOduration=11.285381985 podStartE2EDuration="11.285381985s" podCreationTimestamp="2025-11-29 02:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:43:54.841694784 +0000 UTC m=+5578.013844681" watchObservedRunningTime="2025-11-29 02:44:02.285381985 +0000 UTC m=+5585.457531842" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.320529 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.321093 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="dnsmasq-dns" containerID="cri-o://7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488" gracePeriod=10 Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.914692 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.926900 4749 generic.go:334] "Generic (PLEG): container finished" podID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerID="7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488" exitCode=0 Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.926956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" event={"ID":"a0e717dd-6190-4504-8d53-fd190b2112ab","Type":"ContainerDied","Data":"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488"} Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.926993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" event={"ID":"a0e717dd-6190-4504-8d53-fd190b2112ab","Type":"ContainerDied","Data":"4a7a13c34a8b808f79513390209d2fda5b51f12bb9916c0cbd2ecf02128e440e"} Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.927018 4749 scope.go:117] "RemoveContainer" containerID="7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.927454 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d9f4c8cf-jrgc6" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.955564 4749 scope.go:117] "RemoveContainer" containerID="b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.986015 4749 scope.go:117] "RemoveContainer" containerID="7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488" Nov 29 02:44:02 crc kubenswrapper[4749]: E1129 02:44:02.986503 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488\": container with ID starting with 7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488 not found: ID does not exist" containerID="7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.986538 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488"} err="failed to get container status \"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488\": rpc error: code = NotFound desc = could not find container \"7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488\": container with ID starting with 7dd5c90d889cdfac541cc55f2a54b736a2055966d20bfaf53f8073ed8d210488 not found: ID does not exist" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.986563 4749 scope.go:117] "RemoveContainer" containerID="b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17" Nov 29 02:44:02 crc kubenswrapper[4749]: E1129 02:44:02.986961 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17\": container with ID starting with b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17 not found: ID does not exist" containerID="b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.987090 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17"} err="failed to get container status \"b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17\": rpc error: code = NotFound desc = could not find container \"b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17\": container with ID starting with b24462ff43669a8b4a6f7382800da075b661600055616cb25aa72db556852c17 not found: ID does not exist" Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.998936 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257ns\" (UniqueName: \"kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns\") pod \"a0e717dd-6190-4504-8d53-fd190b2112ab\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.999021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config\") pod \"a0e717dd-6190-4504-8d53-fd190b2112ab\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.999060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc\") pod \"a0e717dd-6190-4504-8d53-fd190b2112ab\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.999315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb\") pod \"a0e717dd-6190-4504-8d53-fd190b2112ab\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " Nov 29 02:44:02 crc kubenswrapper[4749]: I1129 02:44:02.999862 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb\") pod \"a0e717dd-6190-4504-8d53-fd190b2112ab\" (UID: \"a0e717dd-6190-4504-8d53-fd190b2112ab\") " Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.007024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns" (OuterVolumeSpecName: "kube-api-access-257ns") pod "a0e717dd-6190-4504-8d53-fd190b2112ab" (UID: "a0e717dd-6190-4504-8d53-fd190b2112ab"). InnerVolumeSpecName "kube-api-access-257ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.056464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0e717dd-6190-4504-8d53-fd190b2112ab" (UID: "a0e717dd-6190-4504-8d53-fd190b2112ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.061421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config" (OuterVolumeSpecName: "config") pod "a0e717dd-6190-4504-8d53-fd190b2112ab" (UID: "a0e717dd-6190-4504-8d53-fd190b2112ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.067755 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0e717dd-6190-4504-8d53-fd190b2112ab" (UID: "a0e717dd-6190-4504-8d53-fd190b2112ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.084634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0e717dd-6190-4504-8d53-fd190b2112ab" (UID: "a0e717dd-6190-4504-8d53-fd190b2112ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.101554 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.101583 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.101598 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257ns\" (UniqueName: \"kubernetes.io/projected/a0e717dd-6190-4504-8d53-fd190b2112ab-kube-api-access-257ns\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.101611 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.101623 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0e717dd-6190-4504-8d53-fd190b2112ab-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.252806 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:44:03 crc kubenswrapper[4749]: I1129 02:44:03.265215 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d9f4c8cf-jrgc6"] Nov 29 02:44:05 crc kubenswrapper[4749]: I1129 02:44:05.094490 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" path="/var/lib/kubelet/pods/a0e717dd-6190-4504-8d53-fd190b2112ab/volumes" Nov 29 02:44:22 crc kubenswrapper[4749]: I1129 02:44:22.463888 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-766b69ccd5-fhfd7" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.053381 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pjk59"] Nov 29 02:44:30 crc kubenswrapper[4749]: E1129 02:44:30.054312 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="init" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.054327 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="init" Nov 29 
02:44:30 crc kubenswrapper[4749]: E1129 02:44:30.054339 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="dnsmasq-dns" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.054347 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="dnsmasq-dns" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.054565 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e717dd-6190-4504-8d53-fd190b2112ab" containerName="dnsmasq-dns" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.055224 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.066359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pjk59"] Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.134092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.134240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8dpg\" (UniqueName: \"kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.148785 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2eba-account-create-update-jxl2n"] Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.149865 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.151895 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.173233 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2eba-account-create-update-jxl2n"] Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.236164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8dpg\" (UniqueName: \"kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.236239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czftd\" (UniqueName: \"kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd\") pod \"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.236356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.236402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts\") pod \"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.237381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.261942 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8dpg\" (UniqueName: \"kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg\") pod \"glance-db-create-pjk59\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.337934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czftd\" (UniqueName: \"kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd\") pod \"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.338418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts\") pod 
\"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.339153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts\") pod \"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.353122 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czftd\" (UniqueName: \"kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd\") pod \"glance-2eba-account-create-update-jxl2n\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.383472 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjk59" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.476865 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.828865 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pjk59"] Nov 29 02:44:30 crc kubenswrapper[4749]: W1129 02:44:30.952878 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7992f3f8_703c_4b6b_8ef1_4067d9972fc6.slice/crio-7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046 WatchSource:0}: Error finding container 7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046: Status 404 returned error can't find the container with id 7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046 Nov 29 02:44:30 crc kubenswrapper[4749]: I1129 02:44:30.956484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2eba-account-create-update-jxl2n"] Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.193839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2eba-account-create-update-jxl2n" event={"ID":"7992f3f8-703c-4b6b-8ef1-4067d9972fc6","Type":"ContainerStarted","Data":"ec1124e6ca2676848d0bc99bf74118baa8c8c5569dd02d33d4de4b3b845de850"} Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.194283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2eba-account-create-update-jxl2n" event={"ID":"7992f3f8-703c-4b6b-8ef1-4067d9972fc6","Type":"ContainerStarted","Data":"7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046"} Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.196449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjk59" event={"ID":"794dc0b7-98bd-4b0b-8c79-19523f9057b9","Type":"ContainerStarted","Data":"258fc7cccacaab4e83383ea19c59acaac2e0bdf117b2cb2c4b6c94702178cf19"} Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.196493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjk59" event={"ID":"794dc0b7-98bd-4b0b-8c79-19523f9057b9","Type":"ContainerStarted","Data":"d9ddc53b0b33b9c593760de6528bd117933bc3147e5d368fe706d9bc97f88601"} Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.218593 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-2eba-account-create-update-jxl2n" podStartSLOduration=1.218555611 podStartE2EDuration="1.218555611s" podCreationTimestamp="2025-11-29 02:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:31.206654381 +0000 UTC m=+5614.378804258" watchObservedRunningTime="2025-11-29 02:44:31.218555611 +0000 UTC m=+5614.390705548" Nov 29 02:44:31 crc kubenswrapper[4749]: I1129 02:44:31.228587 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pjk59" podStartSLOduration=1.228557644 podStartE2EDuration="1.228557644s" podCreationTimestamp="2025-11-29 02:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:31.218218403 +0000 UTC m=+5614.390368290" watchObservedRunningTime="2025-11-29 02:44:31.228557644 +0000 UTC m=+5614.400707581" Nov 29 02:44:32 crc kubenswrapper[4749]: I1129 02:44:32.216367 4749 generic.go:334] "Generic (PLEG): container finished" podID="794dc0b7-98bd-4b0b-8c79-19523f9057b9" containerID="258fc7cccacaab4e83383ea19c59acaac2e0bdf117b2cb2c4b6c94702178cf19" exitCode=0 Nov 29 02:44:32 crc kubenswrapper[4749]: I1129 02:44:32.216429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjk59" event={"ID":"794dc0b7-98bd-4b0b-8c79-19523f9057b9","Type":"ContainerDied","Data":"258fc7cccacaab4e83383ea19c59acaac2e0bdf117b2cb2c4b6c94702178cf19"} Nov 29 02:44:32 crc kubenswrapper[4749]: I1129 02:44:32.223058 4749 generic.go:334] "Generic (PLEG): container finished" podID="7992f3f8-703c-4b6b-8ef1-4067d9972fc6" containerID="ec1124e6ca2676848d0bc99bf74118baa8c8c5569dd02d33d4de4b3b845de850" exitCode=0 Nov 29 02:44:32 crc kubenswrapper[4749]: I1129 02:44:32.223110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2eba-account-create-update-jxl2n" event={"ID":"7992f3f8-703c-4b6b-8ef1-4067d9972fc6","Type":"ContainerDied","Data":"ec1124e6ca2676848d0bc99bf74118baa8c8c5569dd02d33d4de4b3b845de850"} Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.697961 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjk59" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.703656 4749 util.go:48] "No ready sandbox for pod can be found. 
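[Note] glance-db-create-pjk59 and glance-2eba-account-create-update-jxl2n follow the run-to-completion pattern visible above: each container starts, exits with code 0, and the pod is then torn down. Pods like these are normally owned by batch/v1 Jobs created by the operator; the namespace is from the log, but the Job name below is a guess for illustration. A hedged client-go sketch that waits for such a Job to report success:

    package main

    import (
        "context"
        "fmt"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // waitForJob polls until the named Job reports at least one succeeded
    // pod, roughly the condition behind the exitCode=0 entries above.
    func waitForJob(cs *kubernetes.Clientset, ns, name string) error {
        for {
            job, err := cs.BatchV1().Jobs(ns).Get(context.TODO(), name, metav1.GetOptions{})
            if err != nil {
                return err
            }
            if job.Status.Succeeded > 0 {
                return nil
            }
            time.Sleep(2 * time.Second)
        }
    }

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        // "glance-db-create" is a hypothetical Job name for this sketch.
        fmt.Println(waitForJob(cs, "openstack", "glance-db-create"))
    }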
Need to start a new one" pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.822593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8dpg\" (UniqueName: \"kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg\") pod \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.822691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts\") pod \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.822829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts\") pod \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\" (UID: \"794dc0b7-98bd-4b0b-8c79-19523f9057b9\") " Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.822906 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czftd\" (UniqueName: \"kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd\") pod \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\" (UID: \"7992f3f8-703c-4b6b-8ef1-4067d9972fc6\") " Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.823890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7992f3f8-703c-4b6b-8ef1-4067d9972fc6" (UID: "7992f3f8-703c-4b6b-8ef1-4067d9972fc6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.824311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "794dc0b7-98bd-4b0b-8c79-19523f9057b9" (UID: "794dc0b7-98bd-4b0b-8c79-19523f9057b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.829086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd" (OuterVolumeSpecName: "kube-api-access-czftd") pod "7992f3f8-703c-4b6b-8ef1-4067d9972fc6" (UID: "7992f3f8-703c-4b6b-8ef1-4067d9972fc6"). InnerVolumeSpecName "kube-api-access-czftd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.831609 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg" (OuterVolumeSpecName: "kube-api-access-f8dpg") pod "794dc0b7-98bd-4b0b-8c79-19523f9057b9" (UID: "794dc0b7-98bd-4b0b-8c79-19523f9057b9"). InnerVolumeSpecName "kube-api-access-f8dpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.924878 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8dpg\" (UniqueName: \"kubernetes.io/projected/794dc0b7-98bd-4b0b-8c79-19523f9057b9-kube-api-access-f8dpg\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.924919 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.924931 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/794dc0b7-98bd-4b0b-8c79-19523f9057b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:33 crc kubenswrapper[4749]: I1129 02:44:33.924952 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czftd\" (UniqueName: \"kubernetes.io/projected/7992f3f8-703c-4b6b-8ef1-4067d9972fc6-kube-api-access-czftd\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.265084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pjk59" event={"ID":"794dc0b7-98bd-4b0b-8c79-19523f9057b9","Type":"ContainerDied","Data":"d9ddc53b0b33b9c593760de6528bd117933bc3147e5d368fe706d9bc97f88601"} Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.265149 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9ddc53b0b33b9c593760de6528bd117933bc3147e5d368fe706d9bc97f88601" Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.265101 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pjk59" Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.271132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2eba-account-create-update-jxl2n" event={"ID":"7992f3f8-703c-4b6b-8ef1-4067d9972fc6","Type":"ContainerDied","Data":"7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046"} Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.271192 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c465d6ab50a6a0c308aed793132c1c1c762ceca8c10de9c7a6333401715b046" Nov 29 02:44:34 crc kubenswrapper[4749]: I1129 02:44:34.271301 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2eba-account-create-update-jxl2n" Nov 29 02:44:34 crc kubenswrapper[4749]: E1129 02:44:34.430078 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7992f3f8_703c_4b6b_8ef1_4067d9972fc6.slice\": RecentStats: unable to find data in memory cache]" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.265787 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2mvvp"] Nov 29 02:44:35 crc kubenswrapper[4749]: E1129 02:44:35.267559 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7992f3f8-703c-4b6b-8ef1-4067d9972fc6" containerName="mariadb-account-create-update" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.267640 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7992f3f8-703c-4b6b-8ef1-4067d9972fc6" containerName="mariadb-account-create-update" Nov 29 02:44:35 crc kubenswrapper[4749]: E1129 02:44:35.268017 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794dc0b7-98bd-4b0b-8c79-19523f9057b9" containerName="mariadb-database-create" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.268075 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="794dc0b7-98bd-4b0b-8c79-19523f9057b9" containerName="mariadb-database-create" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.268303 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7992f3f8-703c-4b6b-8ef1-4067d9972fc6" containerName="mariadb-account-create-update" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.268391 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="794dc0b7-98bd-4b0b-8c79-19523f9057b9" containerName="mariadb-database-create" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.269023 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.271375 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.272481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2cth6" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.276590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2mvvp"] Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.354645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.354731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmff\" (UniqueName: \"kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.354893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.354952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.456805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.456876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmff\" (UniqueName: \"kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.456947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.456983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle\") pod 
\"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.467009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.467129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.480635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.483441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmff\" (UniqueName: \"kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff\") pod \"glance-db-sync-2mvvp\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:35 crc kubenswrapper[4749]: I1129 02:44:35.619629 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:36 crc kubenswrapper[4749]: I1129 02:44:36.104083 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2mvvp"] Nov 29 02:44:36 crc kubenswrapper[4749]: I1129 02:44:36.292395 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mvvp" event={"ID":"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38","Type":"ContainerStarted","Data":"3b78c681f7b92caa5a89597d7a32d5c1f04f7a7aefedf56221c3851ae09c721b"} Nov 29 02:44:37 crc kubenswrapper[4749]: I1129 02:44:37.311315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mvvp" event={"ID":"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38","Type":"ContainerStarted","Data":"1604cc0ca142cede583092b141edd470bd06006fd9234bfcc160f956b6c1fbe4"} Nov 29 02:44:37 crc kubenswrapper[4749]: I1129 02:44:37.345273 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2mvvp" podStartSLOduration=2.345248881 podStartE2EDuration="2.345248881s" podCreationTimestamp="2025-11-29 02:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:37.337878672 +0000 UTC m=+5620.510028569" watchObservedRunningTime="2025-11-29 02:44:37.345248881 +0000 UTC m=+5620.517398778" Nov 29 02:44:40 crc kubenswrapper[4749]: I1129 02:44:40.346628 4749 generic.go:334] "Generic (PLEG): container finished" podID="fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" containerID="1604cc0ca142cede583092b141edd470bd06006fd9234bfcc160f956b6c1fbe4" exitCode=0 Nov 29 02:44:40 crc kubenswrapper[4749]: I1129 02:44:40.346724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mvvp" 
event={"ID":"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38","Type":"ContainerDied","Data":"1604cc0ca142cede583092b141edd470bd06006fd9234bfcc160f956b6c1fbe4"} Nov 29 02:44:41 crc kubenswrapper[4749]: I1129 02:44:41.914734 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.114459 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data\") pod \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.114559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle\") pod \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.114635 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fmff\" (UniqueName: \"kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff\") pod \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.114771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data\") pod \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\" (UID: \"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38\") " Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.124336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff" (OuterVolumeSpecName: "kube-api-access-4fmff") pod "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" (UID: "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38"). InnerVolumeSpecName "kube-api-access-4fmff". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.126486 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" (UID: "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.165060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" (UID: "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.200345 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data" (OuterVolumeSpecName: "config-data") pod "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" (UID: "fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.218464 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.218511 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.218533 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fmff\" (UniqueName: \"kubernetes.io/projected/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-kube-api-access-4fmff\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.218557 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.373632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2mvvp" event={"ID":"fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38","Type":"ContainerDied","Data":"3b78c681f7b92caa5a89597d7a32d5c1f04f7a7aefedf56221c3851ae09c721b"} Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.374060 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b78c681f7b92caa5a89597d7a32d5c1f04f7a7aefedf56221c3851ae09c721b" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.373733 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2mvvp" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.712683 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:42 crc kubenswrapper[4749]: E1129 02:44:42.713092 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" containerName="glance-db-sync" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.713112 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" containerName="glance-db-sync" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.713356 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" containerName="glance-db-sync" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.714409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.717865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.718047 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.718274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2cth6" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.722095 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk8x9\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.727369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " 
pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.748620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.819816 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.821040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.831192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk8x9\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdtn\" (UniqueName: \"kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835592 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.835725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.837689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.840895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.855921 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.859358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.861363 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.864181 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nk8x9\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.864277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.870523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.937139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdtn\" (UniqueName: \"kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.937311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.937347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.937368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.937453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.938283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.938312 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc\") 
pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.938743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.938873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.958485 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.959745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.960906 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdtn\" (UniqueName: \"kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn\") pod \"dnsmasq-dns-58c6cd76c9-m22r4\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.968039 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 02:44:42 crc kubenswrapper[4749]: I1129 02:44:42.981470 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.038356 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039221 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wtc\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.039494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wtc\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.141920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.142009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.145895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.145930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 
02:44:43.146058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.146397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.159483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wtc\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc\") pod \"glance-default-internal-api-0\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.210045 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.305163 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.581149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.646455 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:44:43 crc kubenswrapper[4749]: W1129 02:44:43.696245 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24fb78bb_6d28_4a42_ba1a_68342ec9b88d.slice/crio-ae150141fb1d47ff1adbac4b7de836a906f91ad259fb97f15a61c3764fe483d1 WatchSource:0}: Error finding container ae150141fb1d47ff1adbac4b7de836a906f91ad259fb97f15a61c3764fe483d1: Status 404 returned error can't find the container with id ae150141fb1d47ff1adbac4b7de836a906f91ad259fb97f15a61c3764fe483d1 Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.808675 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:43 crc kubenswrapper[4749]: W1129 02:44:43.829528 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0be554f6_5e6e_43a7_b3bb_cfa7868d5593.slice/crio-6293d92b8cfd39ebf966dc5fffb3d45466be9e0ba1671b9046b735953d46636a WatchSource:0}: Error finding container 6293d92b8cfd39ebf966dc5fffb3d45466be9e0ba1671b9046b735953d46636a: Status 404 returned error can't find the container with id 6293d92b8cfd39ebf966dc5fffb3d45466be9e0ba1671b9046b735953d46636a Nov 29 02:44:43 crc kubenswrapper[4749]: I1129 02:44:43.915350 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.416660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerStarted","Data":"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d"} Nov 29 02:44:44 
crc kubenswrapper[4749]: I1129 02:44:44.418356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerStarted","Data":"5bd9c78dc05458ca7791be8ddf7ff7504bed15dc141ed6732a9e85a9e063b109"} Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.431469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerStarted","Data":"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6"} Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.431512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerStarted","Data":"6293d92b8cfd39ebf966dc5fffb3d45466be9e0ba1671b9046b735953d46636a"} Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.445343 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.447176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.462114 4749 generic.go:334] "Generic (PLEG): container finished" podID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerID="c9c6d691a73ab05bbac6e567757d1b6693716804fb7141c5d76a9746938a3743" exitCode=0 Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.462217 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" event={"ID":"24fb78bb-6d28-4a42-ba1a-68342ec9b88d","Type":"ContainerDied","Data":"c9c6d691a73ab05bbac6e567757d1b6693716804fb7141c5d76a9746938a3743"} Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.462247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" event={"ID":"24fb78bb-6d28-4a42-ba1a-68342ec9b88d","Type":"ContainerStarted","Data":"ae150141fb1d47ff1adbac4b7de836a906f91ad259fb97f15a61c3764fe483d1"} Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.472754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj5g4\" (UniqueName: \"kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.472849 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.473663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.492919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.575993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.576113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.576360 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj5g4\" (UniqueName: \"kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.576484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.576589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.594455 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj5g4\" (UniqueName: \"kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4\") pod \"redhat-marketplace-6zjx2\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:44 crc kubenswrapper[4749]: I1129 02:44:44.788132 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:45 crc kubenswrapper[4749]: W1129 02:44:45.233495 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d59da2c_dca3_4884_bf1b_fd309b199012.slice/crio-529cd6c1fb1ddbbe0099371f008c97912de1b566cf7fd3173170e772e9ca734c WatchSource:0}: Error finding container 529cd6c1fb1ddbbe0099371f008c97912de1b566cf7fd3173170e772e9ca734c: Status 404 returned error can't find the container with id 529cd6c1fb1ddbbe0099371f008c97912de1b566cf7fd3173170e772e9ca734c Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.254218 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.489540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerStarted","Data":"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041"} Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.492558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" event={"ID":"24fb78bb-6d28-4a42-ba1a-68342ec9b88d","Type":"ContainerStarted","Data":"7ea7f37d0c27e25a7be2387221d54c678f088f26f5d12509f20db6e5450f16f9"} Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.492710 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.494011 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerID="cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6" exitCode=0 Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.494073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerDied","Data":"cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6"} Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.494121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerStarted","Data":"529cd6c1fb1ddbbe0099371f008c97912de1b566cf7fd3173170e772e9ca734c"} Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.495903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerStarted","Data":"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f"} Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.496027 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-log" containerID="cri-o://62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" gracePeriod=30 Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.496102 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-httpd" containerID="cri-o://bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" gracePeriod=30 Nov 29 02:44:45 crc 
kubenswrapper[4749]: I1129 02:44:45.496350 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.514287 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.51427086 podStartE2EDuration="3.51427086s" podCreationTimestamp="2025-11-29 02:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:45.51303367 +0000 UTC m=+5628.685183567" watchObservedRunningTime="2025-11-29 02:44:45.51427086 +0000 UTC m=+5628.686420717" Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.550134 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" podStartSLOduration=3.550109002 podStartE2EDuration="3.550109002s" podCreationTimestamp="2025-11-29 02:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:45.540631071 +0000 UTC m=+5628.712780938" watchObservedRunningTime="2025-11-29 02:44:45.550109002 +0000 UTC m=+5628.722258879" Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.566266 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.5662493939999997 podStartE2EDuration="3.566249394s" podCreationTimestamp="2025-11-29 02:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:45.562335589 +0000 UTC m=+5628.734485456" watchObservedRunningTime="2025-11-29 02:44:45.566249394 +0000 UTC m=+5628.738399251" Nov 29 02:44:45 crc kubenswrapper[4749]: I1129 02:44:45.959330 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.203229 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.326425 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.326882 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk8x9\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9\") pod \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\" (UID: \"4fbfdf78-632b-48cd-9206-5f584d9b7a5c\") " Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.327624 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs" (OuterVolumeSpecName: "logs") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.328010 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.328082 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.331562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph" (OuterVolumeSpecName: "ceph") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.333717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9" (OuterVolumeSpecName: "kube-api-access-nk8x9") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "kube-api-access-nk8x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.339939 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts" (OuterVolumeSpecName: "scripts") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.361418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.389605 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data" (OuterVolumeSpecName: "config-data") pod "4fbfdf78-632b-48cd-9206-5f584d9b7a5c" (UID: "4fbfdf78-632b-48cd-9206-5f584d9b7a5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.429392 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk8x9\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-kube-api-access-nk8x9\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.429440 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.429450 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.429462 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.429471 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fbfdf78-632b-48cd-9206-5f584d9b7a5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.505439 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerID="f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5" exitCode=0 Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.505485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerDied","Data":"f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5"} Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507526 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerID="bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" exitCode=0 Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507553 4749 generic.go:334] "Generic (PLEG): container finished" podID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerID="62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" exitCode=143 Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507573 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerDied","Data":"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f"} Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerDied","Data":"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d"} Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4fbfdf78-632b-48cd-9206-5f584d9b7a5c","Type":"ContainerDied","Data":"5bd9c78dc05458ca7791be8ddf7ff7504bed15dc141ed6732a9e85a9e063b109"} Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.507675 4749 scope.go:117] "RemoveContainer" containerID="bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.540059 4749 scope.go:117] "RemoveContainer" containerID="62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.550064 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.565316 4749 scope.go:117] "RemoveContainer" containerID="bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" Nov 29 02:44:46 crc kubenswrapper[4749]: E1129 02:44:46.567346 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f\": container with ID starting with bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f not found: ID does not exist" containerID="bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.567378 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f"} err="failed to get container status \"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f\": rpc error: code = NotFound desc = could not find container \"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f\": container with ID starting with bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f not found: ID does not exist" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.567403 4749 scope.go:117] "RemoveContainer" containerID="62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" Nov 29 02:44:46 crc kubenswrapper[4749]: E1129 02:44:46.567692 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d\": container with ID starting with 62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d not found: ID does not exist" containerID="62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.567745 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d"} err="failed to get container status \"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d\": rpc error: code = NotFound desc = could not find container \"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d\": container with ID starting with 62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d not found: ID does not exist" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.567780 4749 scope.go:117] "RemoveContainer" containerID="bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.568161 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f"} err="failed to get container status \"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f\": rpc error: code = NotFound desc = could not find container \"bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f\": container with ID starting with bf8fb322109febb5ca9e382c17d8efc6a7e8d7e421ee6e0025fe519ac0886a9f not found: ID does not exist" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.568186 4749 scope.go:117] "RemoveContainer" containerID="62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.568391 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d"} err="failed to get container status \"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d\": rpc error: code = NotFound desc = could not find container \"62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d\": container with ID starting with 62f6d45d488b5454803d2b2430dd01a6ef64aab087a08873fc21ff37934b7f1d not found: ID does not exist" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.570933 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.581610 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:46 crc kubenswrapper[4749]: E1129 02:44:46.582035 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-httpd" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.582060 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-httpd" Nov 29 02:44:46 crc kubenswrapper[4749]: E1129 02:44:46.582104 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-log" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.582114 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-log" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.582358 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-httpd" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.582378 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" containerName="glance-log" Nov 29 02:44:46 crc kubenswrapper[4749]: 
I1129 02:44:46.583498 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.588850 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.593161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.632114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.632175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.632914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.632934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.632979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.633050 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8994g\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.633086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.734912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8994g\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735139 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.735739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.736439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.741534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc 
kubenswrapper[4749]: I1129 02:44:46.741596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.742273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.742387 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.751343 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8994g\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g\") pod \"glance-default-external-api-0\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " pod="openstack/glance-default-external-api-0" Nov 29 02:44:46 crc kubenswrapper[4749]: I1129 02:44:46.902215 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.096100 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fbfdf78-632b-48cd-9206-5f584d9b7a5c" path="/var/lib/kubelet/pods/4fbfdf78-632b-48cd-9206-5f584d9b7a5c/volumes" Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.512592 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.527620 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerStarted","Data":"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3"} Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.529765 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-log" containerID="cri-o://f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" gracePeriod=30 Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.529828 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-httpd" containerID="cri-o://acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" gracePeriod=30 Nov 29 02:44:47 crc kubenswrapper[4749]: I1129 02:44:47.561893 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zjx2" podStartSLOduration=2.077238663 podStartE2EDuration="3.561873227s" podCreationTimestamp="2025-11-29 02:44:44 +0000 UTC" firstStartedPulling="2025-11-29 02:44:45.496040857 +0000 UTC m=+5628.668190734" 
lastFinishedPulling="2025-11-29 02:44:46.980675411 +0000 UTC m=+5630.152825298" observedRunningTime="2025-11-29 02:44:47.553152935 +0000 UTC m=+5630.725302832" watchObservedRunningTime="2025-11-29 02:44:47.561873227 +0000 UTC m=+5630.734023094" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.267500 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361417 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.361553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wtc\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc\") pod \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\" (UID: \"0be554f6-5e6e-43a7-b3bb-cfa7868d5593\") " Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.362011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs" (OuterVolumeSpecName: "logs") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.362046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.366664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc" (OuterVolumeSpecName: "kube-api-access-j7wtc") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "kube-api-access-j7wtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.367078 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts" (OuterVolumeSpecName: "scripts") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.371323 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph" (OuterVolumeSpecName: "ceph") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.413787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data" (OuterVolumeSpecName: "config-data") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.416591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0be554f6-5e6e-43a7-b3bb-cfa7868d5593" (UID: "0be554f6-5e6e-43a7-b3bb-cfa7868d5593"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463391 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463441 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463457 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463468 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463483 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463514 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.463526 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wtc\" (UniqueName: \"kubernetes.io/projected/0be554f6-5e6e-43a7-b3bb-cfa7868d5593-kube-api-access-j7wtc\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.538453 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerStarted","Data":"bc21b65a113c4ec0128958fd2424033a7bf8824181a1d2f04d4c0311ea08185a"} Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.538505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerStarted","Data":"94b04ddf20a343d4db492cd73f639dc05ca56fa859653f37575ae66d9ca37377"} Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540238 4749 generic.go:334] "Generic (PLEG): container finished" podID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerID="acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" exitCode=0 Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540268 4749 generic.go:334] "Generic (PLEG): container finished" podID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerID="f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" exitCode=143 Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540281 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerDied","Data":"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041"} Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540365 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerDied","Data":"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6"} Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0be554f6-5e6e-43a7-b3bb-cfa7868d5593","Type":"ContainerDied","Data":"6293d92b8cfd39ebf966dc5fffb3d45466be9e0ba1671b9046b735953d46636a"} Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.540387 4749 scope.go:117] "RemoveContainer" containerID="acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.576958 4749 scope.go:117] "RemoveContainer" containerID="f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.582926 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.592212 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.602677 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:48 crc kubenswrapper[4749]: E1129 02:44:48.603266 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-log" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.603364 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-log" Nov 29 02:44:48 crc kubenswrapper[4749]: E1129 02:44:48.603497 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-httpd" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.603574 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-httpd" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.603836 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-log" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.603963 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" containerName="glance-httpd" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.605109 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.607446 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.609382 4749 scope.go:117] "RemoveContainer" containerID="acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" Nov 29 02:44:48 crc kubenswrapper[4749]: E1129 02:44:48.612704 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041\": container with ID starting with acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041 not found: ID does not exist" containerID="acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.612741 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041"} err="failed to get container status \"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041\": rpc error: code = NotFound desc = could not find container \"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041\": container with ID starting with acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041 not found: ID does not exist" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.612768 4749 scope.go:117] "RemoveContainer" containerID="f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" Nov 29 02:44:48 crc kubenswrapper[4749]: E1129 02:44:48.613089 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6\": container with ID starting with f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6 not found: ID does not exist" containerID="f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.613232 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6"} err="failed to get container status \"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6\": rpc error: code = NotFound desc = could not find container \"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6\": container with ID starting with f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6 not found: ID does not exist" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.613326 4749 scope.go:117] "RemoveContainer" containerID="acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.613523 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.615438 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041"} err="failed to get container status \"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041\": rpc error: code = NotFound desc = could not find container \"acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041\": container with ID 
starting with acd2a7b5491e77c1181ba677d9105abede469d803247b5a9acde20b565c40041 not found: ID does not exist" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.615480 4749 scope.go:117] "RemoveContainer" containerID="f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.617431 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6"} err="failed to get container status \"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6\": rpc error: code = NotFound desc = could not find container \"f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6\": container with ID starting with f5d5ce6f9b625c57de685e7eb338831eb0f35b44c32927db0d3f3231fe0b19a6 not found: ID does not exist" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.666489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.666564 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.666692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp6p\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.666789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.666829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.667003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.667284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp6p\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.767985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.768028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.768797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.769396 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.772054 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.772127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.775504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.776183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.789172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp6p\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p\") pod \"glance-default-internal-api-0\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") " pod="openstack/glance-default-internal-api-0" Nov 29 02:44:48 crc kubenswrapper[4749]: I1129 02:44:48.934671 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:49 crc kubenswrapper[4749]: I1129 02:44:49.095566 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be554f6-5e6e-43a7-b3bb-cfa7868d5593" path="/var/lib/kubelet/pods/0be554f6-5e6e-43a7-b3bb-cfa7868d5593/volumes" Nov 29 02:44:49 crc kubenswrapper[4749]: I1129 02:44:49.340447 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:44:49 crc kubenswrapper[4749]: W1129 02:44:49.380351 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod282ac302_d5c0_4c95_b5d5_708cbaa5fc18.slice/crio-587f30738e34d7de46637308b74c44d72e891f6832a333f856a9d5daf69c07f9 WatchSource:0}: Error finding container 587f30738e34d7de46637308b74c44d72e891f6832a333f856a9d5daf69c07f9: Status 404 returned error can't find the container with id 587f30738e34d7de46637308b74c44d72e891f6832a333f856a9d5daf69c07f9 Nov 29 02:44:49 crc kubenswrapper[4749]: I1129 02:44:49.550376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerStarted","Data":"86c570737b7092e8f37dab2c41692f1a45efbe0d3366fb17c0d1f4ebc4e850fa"} Nov 29 02:44:49 crc kubenswrapper[4749]: I1129 02:44:49.553679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerStarted","Data":"587f30738e34d7de46637308b74c44d72e891f6832a333f856a9d5daf69c07f9"} Nov 29 02:44:49 crc kubenswrapper[4749]: I1129 02:44:49.567322 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.5673069 podStartE2EDuration="3.5673069s" podCreationTimestamp="2025-11-29 02:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:49.566221604 +0000 UTC m=+5632.738371461" watchObservedRunningTime="2025-11-29 02:44:49.5673069 +0000 UTC m=+5632.739456757" Nov 29 02:44:50 crc kubenswrapper[4749]: I1129 02:44:50.581030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerStarted","Data":"b4b474e25fca289df9a0f5c7b40be14890f107ba84fd7a0285968d84847ff1f4"} Nov 29 02:44:50 crc kubenswrapper[4749]: I1129 02:44:50.581428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerStarted","Data":"dcfa7565324a18c1ef573fc7854d07f9c9a0fe825fcaa9951a352bff69ecb5de"} Nov 29 02:44:50 crc kubenswrapper[4749]: I1129 02:44:50.629692 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.6296646 podStartE2EDuration="2.6296646s" podCreationTimestamp="2025-11-29 02:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:44:50.611406557 +0000 UTC m=+5633.783556454" watchObservedRunningTime="2025-11-29 02:44:50.6296646 +0000 UTC m=+5633.801814487" Nov 29 02:44:53 crc kubenswrapper[4749]: I1129 02:44:53.211460 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:44:53 crc kubenswrapper[4749]: I1129 02:44:53.309942 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:44:53 crc kubenswrapper[4749]: I1129 02:44:53.310296 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="dnsmasq-dns" containerID="cri-o://78fdae179267d59b9cc2ecc695311fb4612db89834dbb136a24abf186878bd55" gracePeriod=10 Nov 29 02:44:53 crc kubenswrapper[4749]: I1129 02:44:53.613515 4749 generic.go:334] "Generic (PLEG): container finished" podID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerID="78fdae179267d59b9cc2ecc695311fb4612db89834dbb136a24abf186878bd55" exitCode=0 Nov 29 02:44:53 crc kubenswrapper[4749]: I1129 02:44:53.613554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" event={"ID":"d0c5eae2-92c4-4eb1-86e6-ea728e456f00","Type":"ContainerDied","Data":"78fdae179267d59b9cc2ecc695311fb4612db89834dbb136a24abf186878bd55"} Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.404168 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.598774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb\") pod \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.598974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb\") pod \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.599031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlmkt\" (UniqueName: \"kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt\") pod \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.599125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config\") pod \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.599276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc\") pod \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\" (UID: \"d0c5eae2-92c4-4eb1-86e6-ea728e456f00\") " Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.606165 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt" (OuterVolumeSpecName: "kube-api-access-vlmkt") pod "d0c5eae2-92c4-4eb1-86e6-ea728e456f00" (UID: "d0c5eae2-92c4-4eb1-86e6-ea728e456f00"). InnerVolumeSpecName "kube-api-access-vlmkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.631375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" event={"ID":"d0c5eae2-92c4-4eb1-86e6-ea728e456f00","Type":"ContainerDied","Data":"c7ed947c8e09058ff2399f3edebdf226961e46ff8a3228c00a973f61d822c02c"} Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.631435 4749 scope.go:117] "RemoveContainer" containerID="78fdae179267d59b9cc2ecc695311fb4612db89834dbb136a24abf186878bd55" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.631435 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fb47545c-tw6kq" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.655039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config" (OuterVolumeSpecName: "config") pod "d0c5eae2-92c4-4eb1-86e6-ea728e456f00" (UID: "d0c5eae2-92c4-4eb1-86e6-ea728e456f00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.677259 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0c5eae2-92c4-4eb1-86e6-ea728e456f00" (UID: "d0c5eae2-92c4-4eb1-86e6-ea728e456f00"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.680312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0c5eae2-92c4-4eb1-86e6-ea728e456f00" (UID: "d0c5eae2-92c4-4eb1-86e6-ea728e456f00"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.686909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0c5eae2-92c4-4eb1-86e6-ea728e456f00" (UID: "d0c5eae2-92c4-4eb1-86e6-ea728e456f00"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.702017 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.702085 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlmkt\" (UniqueName: \"kubernetes.io/projected/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-kube-api-access-vlmkt\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.702115 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.702141 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.702166 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0c5eae2-92c4-4eb1-86e6-ea728e456f00-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.741678 4749 scope.go:117] "RemoveContainer" containerID="7190fdd43a19bf7e1da031ac07fe4e2a9e41a01fa38d2a073d6dd37c7bc43e70" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.789273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.790643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.848331 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:54 crc kubenswrapper[4749]: I1129 02:44:54.984549 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.004576 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fb47545c-tw6kq"] Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.092075 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" path="/var/lib/kubelet/pods/d0c5eae2-92c4-4eb1-86e6-ea728e456f00/volumes" Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.374713 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.374794 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.731726 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:55 crc kubenswrapper[4749]: I1129 02:44:55.799497 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:56 crc kubenswrapper[4749]: I1129 02:44:56.902617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 02:44:56 crc kubenswrapper[4749]: I1129 02:44:56.902708 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 29 02:44:56 crc kubenswrapper[4749]: I1129 02:44:56.951628 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 02:44:56 crc kubenswrapper[4749]: I1129 02:44:56.955823 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 29 02:44:57 crc kubenswrapper[4749]: I1129 02:44:57.665390 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6zjx2" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="registry-server" containerID="cri-o://2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3" gracePeriod=2 Nov 29 02:44:57 crc kubenswrapper[4749]: I1129 02:44:57.666097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 02:44:57 crc kubenswrapper[4749]: I1129 02:44:57.666257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.220924 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.276672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj5g4\" (UniqueName: \"kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4\") pod \"2d59da2c-dca3-4884-bf1b-fd309b199012\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.276719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content\") pod \"2d59da2c-dca3-4884-bf1b-fd309b199012\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.276771 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities\") pod \"2d59da2c-dca3-4884-bf1b-fd309b199012\" (UID: \"2d59da2c-dca3-4884-bf1b-fd309b199012\") " Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.278793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities" (OuterVolumeSpecName: "utilities") pod "2d59da2c-dca3-4884-bf1b-fd309b199012" (UID: "2d59da2c-dca3-4884-bf1b-fd309b199012"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.297679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d59da2c-dca3-4884-bf1b-fd309b199012" (UID: "2d59da2c-dca3-4884-bf1b-fd309b199012"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.306050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4" (OuterVolumeSpecName: "kube-api-access-tj5g4") pod "2d59da2c-dca3-4884-bf1b-fd309b199012" (UID: "2d59da2c-dca3-4884-bf1b-fd309b199012"). InnerVolumeSpecName "kube-api-access-tj5g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.379589 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.379623 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj5g4\" (UniqueName: \"kubernetes.io/projected/2d59da2c-dca3-4884-bf1b-fd309b199012-kube-api-access-tj5g4\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.379633 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d59da2c-dca3-4884-bf1b-fd309b199012-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.681944 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerID="2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3" exitCode=0 Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.682827 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zjx2" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.685543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerDied","Data":"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3"} Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.685641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zjx2" event={"ID":"2d59da2c-dca3-4884-bf1b-fd309b199012","Type":"ContainerDied","Data":"529cd6c1fb1ddbbe0099371f008c97912de1b566cf7fd3173170e772e9ca734c"} Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.685720 4749 scope.go:117] "RemoveContainer" containerID="2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.746422 4749 scope.go:117] "RemoveContainer" containerID="f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.755958 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.773948 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zjx2"] Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.793894 4749 scope.go:117] "RemoveContainer" containerID="cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.827899 4749 scope.go:117] "RemoveContainer" containerID="2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3" Nov 29 02:44:58 crc kubenswrapper[4749]: E1129 02:44:58.828220 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3\": container with ID starting with 2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3 not found: ID does not exist" containerID="2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.828255 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3"} err="failed to get container status \"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3\": rpc error: code = NotFound desc = could not find container \"2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3\": container with ID starting with 2ed229e32881f0b0dcd0ed13da8999e449b12dc892adf5d9032fa3342ebfa8d3 not found: ID does not exist" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.828275 4749 scope.go:117] "RemoveContainer" containerID="f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5" Nov 29 02:44:58 crc kubenswrapper[4749]: E1129 02:44:58.828534 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5\": container with ID starting with f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5 not found: ID does not exist" containerID="f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.828566 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5"} err="failed to get container status \"f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5\": rpc error: code = NotFound desc = could not find container \"f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5\": container with ID starting with f48c63f6c49e347495f59e36b9e8228db6a86f6b97f73a84fe3977af0f89dac5 not found: ID does not exist" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.828583 4749 scope.go:117] "RemoveContainer" containerID="cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6" Nov 29 02:44:58 crc kubenswrapper[4749]: E1129 02:44:58.828801 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6\": container with ID starting with cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6 not found: ID does not exist" containerID="cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.828826 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6"} err="failed to get container status \"cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6\": rpc error: code = NotFound desc = could not find container \"cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6\": container with ID starting with cca61c79bd421fc705524b044e7ec59bc2e32d6909c94c2923a580dabd1827d6 not found: ID does not exist" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.934907 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.936278 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:58 crc kubenswrapper[4749]: I1129 02:44:58.961976 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.024723 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.084383 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" path="/var/lib/kubelet/pods/2d59da2c-dca3-4884-bf1b-fd309b199012/volumes" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.546398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.583810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.695519 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 02:44:59 crc kubenswrapper[4749]: I1129 02:44:59.695575 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.160443 4749 kubelet.go:2421] 
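
The RemoveContainer / "ContainerStatus from runtime service failed" pairs above are a benign race, not a real failure: the pod was force-removed and CRI-O had already deleted the containers by the time the kubelet re-checked their status, so the runtime answers with gRPC NotFound and the kubelet treats that as "already gone". A minimal sketch of that idempotent-delete pattern, using a hypothetical removeContainer helper (not the kubelet's actual code) and the real google.golang.org/grpc status API:

    // notfound.go -- treat gRPC NotFound from a runtime as "already deleted".
    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // deleteFn stands in for a CRI RemoveContainer-style RPC.
    type deleteFn func(id string) error

    // removeContainer is idempotent: NotFound means the container is already
    // gone, which is exactly the state the caller wanted.
    func removeContainer(del deleteFn, id string) error {
    	if err := del(id); err != nil && status.Code(err) != codes.NotFound {
    		return fmt.Errorf("remove container %s: %w", id, err)
    	}
    	return nil
    }

    func main() {
    	gone := func(id string) error {
    		return status.Error(codes.NotFound, "could not find container "+id)
    	}
    	fmt.Println(removeContainer(gone, "2ed229e3")) // <nil>: already deleted
    }
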
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj"] Nov 29 02:45:00 crc kubenswrapper[4749]: E1129 02:45:00.161024 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="extract-content" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161054 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="extract-content" Nov 29 02:45:00 crc kubenswrapper[4749]: E1129 02:45:00.161102 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="init" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="init" Nov 29 02:45:00 crc kubenswrapper[4749]: E1129 02:45:00.161156 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="registry-server" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161173 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="registry-server" Nov 29 02:45:00 crc kubenswrapper[4749]: E1129 02:45:00.161244 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="dnsmasq-dns" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161262 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="dnsmasq-dns" Nov 29 02:45:00 crc kubenswrapper[4749]: E1129 02:45:00.161300 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="extract-utilities" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161317 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="extract-utilities" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161614 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d59da2c-dca3-4884-bf1b-fd309b199012" containerName="registry-server" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.161645 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c5eae2-92c4-4eb1-86e6-ea728e456f00" containerName="dnsmasq-dns" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.162865 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.166822 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.167927 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.176424 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj"] Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.221776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.221875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.221904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqfgd\" (UniqueName: \"kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.323380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.323465 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.323524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqfgd\" (UniqueName: \"kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.325039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume\") pod 
\"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.331234 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.341450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqfgd\" (UniqueName: \"kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd\") pod \"collect-profiles-29406405-gz8rj\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.534166 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:00 crc kubenswrapper[4749]: W1129 02:45:00.985544 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f0604d_7a7b_4d66_8a77_f389c7a7d406.slice/crio-d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6 WatchSource:0}: Error finding container d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6: Status 404 returned error can't find the container with id d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6 Nov 29 02:45:00 crc kubenswrapper[4749]: I1129 02:45:00.991718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj"] Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.685990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.716854 4749 generic.go:334] "Generic (PLEG): container finished" podID="f7f0604d-7a7b-4d66-8a77-f389c7a7d406" containerID="5744420dc8629deaf306a5678b4b436247af9eae850f8d1d3753e13a31600aad" exitCode=0 Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.716926 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.717037 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" event={"ID":"f7f0604d-7a7b-4d66-8a77-f389c7a7d406","Type":"ContainerDied","Data":"5744420dc8629deaf306a5678b4b436247af9eae850f8d1d3753e13a31600aad"} Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.717104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" event={"ID":"f7f0604d-7a7b-4d66-8a77-f389c7a7d406","Type":"ContainerStarted","Data":"d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6"} Nov 29 02:45:01 crc kubenswrapper[4749]: I1129 02:45:01.732318 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.100983 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.284080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume\") pod \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.284261 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume\") pod \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.284340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqfgd\" (UniqueName: \"kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd\") pod \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\" (UID: \"f7f0604d-7a7b-4d66-8a77-f389c7a7d406\") " Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.285163 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7f0604d-7a7b-4d66-8a77-f389c7a7d406" (UID: "f7f0604d-7a7b-4d66-8a77-f389c7a7d406"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.291747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd" (OuterVolumeSpecName: "kube-api-access-tqfgd") pod "f7f0604d-7a7b-4d66-8a77-f389c7a7d406" (UID: "f7f0604d-7a7b-4d66-8a77-f389c7a7d406"). InnerVolumeSpecName "kube-api-access-tqfgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.305248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7f0604d-7a7b-4d66-8a77-f389c7a7d406" (UID: "f7f0604d-7a7b-4d66-8a77-f389c7a7d406"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.387555 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.387600 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqfgd\" (UniqueName: \"kubernetes.io/projected/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-kube-api-access-tqfgd\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.387615 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7f0604d-7a7b-4d66-8a77-f389c7a7d406-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.748412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" event={"ID":"f7f0604d-7a7b-4d66-8a77-f389c7a7d406","Type":"ContainerDied","Data":"d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6"} Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.748754 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3aff83127ef0c95b5fd3541fe97381058b98a691653efbed828fc7414eff6f6" Nov 29 02:45:03 crc kubenswrapper[4749]: I1129 02:45:03.748476 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj" Nov 29 02:45:04 crc kubenswrapper[4749]: I1129 02:45:04.213877 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"] Nov 29 02:45:04 crc kubenswrapper[4749]: I1129 02:45:04.223044 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406360-swldl"] Nov 29 02:45:05 crc kubenswrapper[4749]: I1129 02:45:05.085500 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f" path="/var/lib/kubelet/pods/ee8d7c5e-51c6-42e3-b2ff-26c596e1ba0f/volumes" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.841938 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-j9x6q"] Nov 29 02:45:09 crc kubenswrapper[4749]: E1129 02:45:09.842962 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f0604d-7a7b-4d66-8a77-f389c7a7d406" containerName="collect-profiles" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.842981 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f0604d-7a7b-4d66-8a77-f389c7a7d406" containerName="collect-profiles" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.843271 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f0604d-7a7b-4d66-8a77-f389c7a7d406" containerName="collect-profiles" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.844005 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.878581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9x6q"] Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.937561 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-148b-account-create-update-xc8b2"] Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.938906 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.942010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 29 02:45:09 crc kubenswrapper[4749]: I1129 02:45:09.947384 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-148b-account-create-update-xc8b2"] Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.016419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt5wm\" (UniqueName: \"kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.016669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.118686 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt5wm\" (UniqueName: \"kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.119155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.119409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhw7n\" (UniqueName: \"kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.119682 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.120358 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.157065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt5wm\" (UniqueName: \"kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm\") pod \"placement-db-create-j9x6q\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.164926 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.220922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhw7n\" (UniqueName: \"kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.221042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.222626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.243578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhw7n\" (UniqueName: \"kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n\") pod \"placement-148b-account-create-update-xc8b2\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.263945 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.684281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-j9x6q"] Nov 29 02:45:10 crc kubenswrapper[4749]: W1129 02:45:10.690299 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e03156c_645e_4f64_8201_3343cc04f9d3.slice/crio-b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045 WatchSource:0}: Error finding container b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045: Status 404 returned error can't find the container with id b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045 Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.804920 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-148b-account-create-update-xc8b2"] Nov 29 02:45:10 crc kubenswrapper[4749]: W1129 02:45:10.805497 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b1f9731_f71b_428d_bc77_dc3b56eeb7c5.slice/crio-3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943 WatchSource:0}: Error finding container 3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943: Status 404 returned error can't find the container with id 3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943 Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.854878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9x6q" event={"ID":"4e03156c-645e-4f64-8201-3343cc04f9d3","Type":"ContainerStarted","Data":"b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045"} Nov 29 02:45:10 crc kubenswrapper[4749]: I1129 02:45:10.856742 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-148b-account-create-update-xc8b2" event={"ID":"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5","Type":"ContainerStarted","Data":"3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943"} Nov 29 02:45:11 crc kubenswrapper[4749]: I1129 02:45:11.871661 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e03156c-645e-4f64-8201-3343cc04f9d3" containerID="3ce065b02ede3505b2dde58138e4310a87b945e2a9ffc2213fb48f573ed255bd" exitCode=0 Nov 29 02:45:11 crc kubenswrapper[4749]: I1129 02:45:11.871882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9x6q" event={"ID":"4e03156c-645e-4f64-8201-3343cc04f9d3","Type":"ContainerDied","Data":"3ce065b02ede3505b2dde58138e4310a87b945e2a9ffc2213fb48f573ed255bd"} Nov 29 02:45:11 crc kubenswrapper[4749]: I1129 02:45:11.876710 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" containerID="6f8b989eb08b24efbf5ffb9d3cd95cea8f648225560eb6c8b6bd8e9bec4f7c75" exitCode=0 Nov 29 02:45:11 crc kubenswrapper[4749]: I1129 02:45:11.876764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-148b-account-create-update-xc8b2" event={"ID":"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5","Type":"ContainerDied","Data":"6f8b989eb08b24efbf5ffb9d3cd95cea8f648225560eb6c8b6bd8e9bec4f7c75"} Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.166272 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.168829 4749 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.172813 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.274457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvn4x\" (UniqueName: \"kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.274508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.274637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.371619 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.376550 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.376616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvn4x\" (UniqueName: \"kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.376658 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.377221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.377521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities\") pod 
\"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.397472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvn4x\" (UniqueName: \"kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x\") pod \"certified-operators-llx8s\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.478034 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts\") pod \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.478159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhw7n\" (UniqueName: \"kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n\") pod \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\" (UID: \"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5\") " Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.478908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" (UID: "7b1f9731-f71b-428d-bc77-dc3b56eeb7c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.485453 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n" (OuterVolumeSpecName: "kube-api-access-mhw7n") pod "7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" (UID: "7b1f9731-f71b-428d-bc77-dc3b56eeb7c5"). InnerVolumeSpecName "kube-api-access-mhw7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.509690 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.537906 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.584065 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.584107 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhw7n\" (UniqueName: \"kubernetes.io/projected/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5-kube-api-access-mhw7n\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.685829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts\") pod \"4e03156c-645e-4f64-8201-3343cc04f9d3\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.685905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt5wm\" (UniqueName: \"kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm\") pod \"4e03156c-645e-4f64-8201-3343cc04f9d3\" (UID: \"4e03156c-645e-4f64-8201-3343cc04f9d3\") " Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.686768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e03156c-645e-4f64-8201-3343cc04f9d3" (UID: "4e03156c-645e-4f64-8201-3343cc04f9d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.692597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm" (OuterVolumeSpecName: "kube-api-access-nt5wm") pod "4e03156c-645e-4f64-8201-3343cc04f9d3" (UID: "4e03156c-645e-4f64-8201-3343cc04f9d3"). InnerVolumeSpecName "kube-api-access-nt5wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.787841 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e03156c-645e-4f64-8201-3343cc04f9d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.787891 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt5wm\" (UniqueName: \"kubernetes.io/projected/4e03156c-645e-4f64-8201-3343cc04f9d3-kube-api-access-nt5wm\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.911845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-j9x6q" event={"ID":"4e03156c-645e-4f64-8201-3343cc04f9d3","Type":"ContainerDied","Data":"b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045"} Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.911881 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b99c2bccdff446c3c94e248705b783b2123d5ec802f9cadfe780266e02c78045" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.911946 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-j9x6q" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.915298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-148b-account-create-update-xc8b2" event={"ID":"7b1f9731-f71b-428d-bc77-dc3b56eeb7c5","Type":"ContainerDied","Data":"3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943"} Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.915332 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a7de65aa44f9c5ee78c4489e45a5af48ad971e2951238d34594950ceff07943" Nov 29 02:45:13 crc kubenswrapper[4749]: I1129 02:45:13.915352 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-148b-account-create-update-xc8b2" Nov 29 02:45:14 crc kubenswrapper[4749]: W1129 02:45:14.020929 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2514916d_93e9_4c79_a031_07c466477796.slice/crio-e30ff8e3824235c62df749f0db51c423e67b7a6335752ec6efdca495dbba3e02 WatchSource:0}: Error finding container e30ff8e3824235c62df749f0db51c423e67b7a6335752ec6efdca495dbba3e02: Status 404 returned error can't find the container with id e30ff8e3824235c62df749f0db51c423e67b7a6335752ec6efdca495dbba3e02 Nov 29 02:45:14 crc kubenswrapper[4749]: I1129 02:45:14.022579 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:14 crc kubenswrapper[4749]: I1129 02:45:14.464168 4749 scope.go:117] "RemoveContainer" containerID="13da0910bcff0cef3358d8a28c1988d32e43c4c9bb53271d26bab119f3d2e2d9" Nov 29 02:45:14 crc kubenswrapper[4749]: I1129 02:45:14.930688 4749 generic.go:334] "Generic (PLEG): container finished" podID="2514916d-93e9-4c79-a031-07c466477796" containerID="736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6" exitCode=0 Nov 29 02:45:14 crc kubenswrapper[4749]: I1129 02:45:14.930761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerDied","Data":"736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6"} Nov 29 02:45:14 crc kubenswrapper[4749]: I1129 02:45:14.931119 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerStarted","Data":"e30ff8e3824235c62df749f0db51c423e67b7a6335752ec6efdca495dbba3e02"} Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.381286 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"] Nov 29 02:45:15 crc kubenswrapper[4749]: E1129 02:45:15.381668 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" containerName="mariadb-account-create-update" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.381685 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" containerName="mariadb-account-create-update" Nov 29 02:45:15 crc kubenswrapper[4749]: E1129 02:45:15.381716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e03156c-645e-4f64-8201-3343cc04f9d3" containerName="mariadb-database-create" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.381723 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4e03156c-645e-4f64-8201-3343cc04f9d3" containerName="mariadb-database-create" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.381889 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" containerName="mariadb-account-create-update" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.381903 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e03156c-645e-4f64-8201-3343cc04f9d3" containerName="mariadb-database-create" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.382804 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.409569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v6nvh"] Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.411032 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.415696 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"] Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.415879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.415971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fr5gf" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.416280 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.423411 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v6nvh"] Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.521883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.521929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrprq\" (UniqueName: \"kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.521978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522058 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znj6j\" (UniqueName: \"kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522077 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.522169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624506 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrprq\" (UniqueName: \"kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624567 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624807 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znj6j\" (UniqueName: \"kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624837 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.624929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.625003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.625019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.625557 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.626109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: 
\"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.626917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.627070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.630496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.630964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.631345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.642621 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znj6j\" (UniqueName: \"kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j\") pod \"dnsmasq-dns-94478fffc-flf7q\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") " pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.644250 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrprq\" (UniqueName: \"kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq\") pod \"placement-db-sync-v6nvh\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.715818 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:15 crc kubenswrapper[4749]: I1129 02:45:15.737300 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.359568 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v6nvh"] Nov 29 02:45:16 crc kubenswrapper[4749]: W1129 02:45:16.361939 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d34f843_bfea_4a0c_ac09_3d6da8ac941d.slice/crio-8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b WatchSource:0}: Error finding container 8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b: Status 404 returned error can't find the container with id 8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.441037 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"] Nov 29 02:45:16 crc kubenswrapper[4749]: W1129 02:45:16.446688 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3537239_98b7_4e2d_a800_36f8cdac49fe.slice/crio-1604568eed9be31a843ec034fc4e79fc88f33655205a7713481452cac05e07ce WatchSource:0}: Error finding container 1604568eed9be31a843ec034fc4e79fc88f33655205a7713481452cac05e07ce: Status 404 returned error can't find the container with id 1604568eed9be31a843ec034fc4e79fc88f33655205a7713481452cac05e07ce Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.973402 4749 generic.go:334] "Generic (PLEG): container finished" podID="2514916d-93e9-4c79-a031-07c466477796" containerID="1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d" exitCode=0 Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.973455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerDied","Data":"1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d"} Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.979797 4749 generic.go:334] "Generic (PLEG): container finished" podID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerID="58d53ddbb1814f6b1a145376d1a9e209cdff8b3693a99395ea76ce44744125e3" exitCode=0 Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.979863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94478fffc-flf7q" event={"ID":"c3537239-98b7-4e2d-a800-36f8cdac49fe","Type":"ContainerDied","Data":"58d53ddbb1814f6b1a145376d1a9e209cdff8b3693a99395ea76ce44744125e3"} Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.979918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94478fffc-flf7q" event={"ID":"c3537239-98b7-4e2d-a800-36f8cdac49fe","Type":"ContainerStarted","Data":"1604568eed9be31a843ec034fc4e79fc88f33655205a7713481452cac05e07ce"} Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.982991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6nvh" event={"ID":"7d34f843-bfea-4a0c-ac09-3d6da8ac941d","Type":"ContainerStarted","Data":"71e5a30db6c4b1de3d06b6020bfe9670a9c0edd6f88dc7fdd63b171397efce0e"} Nov 29 02:45:16 crc kubenswrapper[4749]: I1129 02:45:16.985939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6nvh" event={"ID":"7d34f843-bfea-4a0c-ac09-3d6da8ac941d","Type":"ContainerStarted","Data":"8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b"} Nov 29 
02:45:17 crc kubenswrapper[4749]: I1129 02:45:17.095778 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v6nvh" podStartSLOduration=2.095754233 podStartE2EDuration="2.095754233s" podCreationTimestamp="2025-11-29 02:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:45:17.069815123 +0000 UTC m=+5660.241964990" watchObservedRunningTime="2025-11-29 02:45:17.095754233 +0000 UTC m=+5660.267904120"
Nov 29 02:45:17 crc kubenswrapper[4749]: I1129 02:45:17.994362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94478fffc-flf7q" event={"ID":"c3537239-98b7-4e2d-a800-36f8cdac49fe","Type":"ContainerStarted","Data":"cf45b96353708baff62b8651c040b67d94020a81c9fd3bfd5af439baa31194b8"}
Nov 29 02:45:17 crc kubenswrapper[4749]: I1129 02:45:17.994633 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94478fffc-flf7q"
Nov 29 02:45:17 crc kubenswrapper[4749]: I1129 02:45:17.997542 4749 generic.go:334] "Generic (PLEG): container finished" podID="7d34f843-bfea-4a0c-ac09-3d6da8ac941d" containerID="71e5a30db6c4b1de3d06b6020bfe9670a9c0edd6f88dc7fdd63b171397efce0e" exitCode=0
Nov 29 02:45:17 crc kubenswrapper[4749]: I1129 02:45:17.997621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6nvh" event={"ID":"7d34f843-bfea-4a0c-ac09-3d6da8ac941d","Type":"ContainerDied","Data":"71e5a30db6c4b1de3d06b6020bfe9670a9c0edd6f88dc7fdd63b171397efce0e"}
Nov 29 02:45:18 crc kubenswrapper[4749]: I1129 02:45:18.001110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerStarted","Data":"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b"}
Nov 29 02:45:18 crc kubenswrapper[4749]: I1129 02:45:18.028738 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94478fffc-flf7q" podStartSLOduration=3.028702389 podStartE2EDuration="3.028702389s" podCreationTimestamp="2025-11-29 02:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:45:18.018659225 +0000 UTC m=+5661.190809092" watchObservedRunningTime="2025-11-29 02:45:18.028702389 +0000 UTC m=+5661.200852336"
Nov 29 02:45:18 crc kubenswrapper[4749]: I1129 02:45:18.081843 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llx8s" podStartSLOduration=2.5893374700000003 podStartE2EDuration="5.08182313s" podCreationTimestamp="2025-11-29 02:45:13 +0000 UTC" firstStartedPulling="2025-11-29 02:45:14.935039977 +0000 UTC m=+5658.107189864" lastFinishedPulling="2025-11-29 02:45:17.427525657 +0000 UTC m=+5660.599675524" observedRunningTime="2025-11-29 02:45:18.066092888 +0000 UTC m=+5661.238242765" watchObservedRunningTime="2025-11-29 02:45:18.08182313 +0000 UTC m=+5661.253972997"
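The "Observed pod startup duration" entries above are internally consistent: podStartE2EDuration is the wall-clock gap from podCreationTimestamp to watchObservedRunningTime (02:45:17.095754233 - 02:45:15 = 2.095754233s for placement-db-sync-v6nvh), and podStartSLOduration additionally subtracts the image-pull window where one exists (for certified-operators-llx8s, 5.08182313s minus the 2.49248568s between firstStartedPulling and lastFinishedPulling gives ~2.5893s, matching the logged value to within float rounding). A minimal Go sketch of that arithmetic, with field semantics inferred from the logged values rather than taken from kubelet source:

```go
package main

import (
	"fmt"
	"time"
)

// A minimal sketch of the arithmetic visible in the
// "Observed pod startup duration" entries above. Field semantics are
// inferred from the logged values, not taken from kubelet source.
func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// placement-db-sync-v6nvh: no image pull, so SLO == E2E.
	created := parse("2025-11-29 02:45:15 +0000 UTC")
	observed := parse("2025-11-29 02:45:17.095754233 +0000 UTC")
	fmt.Println(observed.Sub(created)) // 2.095754233s == podStartE2EDuration

	// certified-operators-llx8s: the SLO duration excludes the image-pull window.
	e2e := parse("2025-11-29 02:45:18.08182313 +0000 UTC").Sub(parse("2025-11-29 02:45:13 +0000 UTC"))
	pull := parse("2025-11-29 02:45:17.427525657 +0000 UTC").Sub(parse("2025-11-29 02:45:14.935039977 +0000 UTC"))
	fmt.Println(e2e - pull) // ~2.58933745s, the logged podStartSLOduration up to rounding
}
```

For the two pods whose pull timestamps are zero-valued ("0001-01-01 00:00:00 +0000 UTC"), there is no pull window to subtract, which is why their SLO and E2E durations coincide.
Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.443504 4749 util.go:48] "No ready sandbox for pod can be found.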
Need to start a new one" pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle\") pod \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data\") pod \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts\") pod \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs\") pod \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs" (OuterVolumeSpecName: "logs") pod "7d34f843-bfea-4a0c-ac09-3d6da8ac941d" (UID: "7d34f843-bfea-4a0c-ac09-3d6da8ac941d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.510668 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrprq\" (UniqueName: \"kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq\") pod \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\" (UID: \"7d34f843-bfea-4a0c-ac09-3d6da8ac941d\") " Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.511033 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.516014 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq" (OuterVolumeSpecName: "kube-api-access-jrprq") pod "7d34f843-bfea-4a0c-ac09-3d6da8ac941d" (UID: "7d34f843-bfea-4a0c-ac09-3d6da8ac941d"). InnerVolumeSpecName "kube-api-access-jrprq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.516506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts" (OuterVolumeSpecName: "scripts") pod "7d34f843-bfea-4a0c-ac09-3d6da8ac941d" (UID: "7d34f843-bfea-4a0c-ac09-3d6da8ac941d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.535688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data" (OuterVolumeSpecName: "config-data") pod "7d34f843-bfea-4a0c-ac09-3d6da8ac941d" (UID: "7d34f843-bfea-4a0c-ac09-3d6da8ac941d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.545361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d34f843-bfea-4a0c-ac09-3d6da8ac941d" (UID: "7d34f843-bfea-4a0c-ac09-3d6da8ac941d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.612413 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrprq\" (UniqueName: \"kubernetes.io/projected/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-kube-api-access-jrprq\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.612474 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.612488 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:19 crc kubenswrapper[4749]: I1129 02:45:19.612499 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d34f843-bfea-4a0c-ac09-3d6da8ac941d-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.027107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v6nvh" event={"ID":"7d34f843-bfea-4a0c-ac09-3d6da8ac941d","Type":"ContainerDied","Data":"8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b"} Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.027168 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ce9ebd4039f4c834b5755b467322e4fd5a4af73763afefcb52f82e8e4372b7b" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.027251 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v6nvh" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.589495 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-567fc5ff4d-d2cnq"] Nov 29 02:45:20 crc kubenswrapper[4749]: E1129 02:45:20.590233 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d34f843-bfea-4a0c-ac09-3d6da8ac941d" containerName="placement-db-sync" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.590268 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d34f843-bfea-4a0c-ac09-3d6da8ac941d" containerName="placement-db-sync" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.590714 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d34f843-bfea-4a0c-ac09-3d6da8ac941d" containerName="placement-db-sync" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.592930 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.635460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-567fc5ff4d-d2cnq"] Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.637764 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.638119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.638536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fr5gf" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.736553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-scripts\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.736632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-combined-ca-bundle\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.736809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6g5b\" (UniqueName: \"kubernetes.io/projected/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-kube-api-access-m6g5b\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.737407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-logs\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.737486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-config-data\") pod 
\"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-logs\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-config-data\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839215 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-scripts\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839238 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-combined-ca-bundle\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6g5b\" (UniqueName: \"kubernetes.io/projected/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-kube-api-access-m6g5b\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.839961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-logs\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.845157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-combined-ca-bundle\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.846900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-config-data\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.847060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-scripts\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.853382 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6g5b\" (UniqueName: \"kubernetes.io/projected/bd4b1978-3d21-4daa-89b8-292d5e4cdf9e-kube-api-access-m6g5b\") pod \"placement-567fc5ff4d-d2cnq\" (UID: \"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e\") " pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:20 crc kubenswrapper[4749]: I1129 02:45:20.958617 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:21 crc kubenswrapper[4749]: W1129 02:45:21.469351 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4b1978_3d21_4daa_89b8_292d5e4cdf9e.slice/crio-4cc98b642f83152d11d4b2c998baa02bd3fca57160638c8f686b2768da4ee984 WatchSource:0}: Error finding container 4cc98b642f83152d11d4b2c998baa02bd3fca57160638c8f686b2768da4ee984: Status 404 returned error can't find the container with id 4cc98b642f83152d11d4b2c998baa02bd3fca57160638c8f686b2768da4ee984 Nov 29 02:45:21 crc kubenswrapper[4749]: I1129 02:45:21.478711 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-567fc5ff4d-d2cnq"] Nov 29 02:45:22 crc kubenswrapper[4749]: I1129 02:45:22.053264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567fc5ff4d-d2cnq" event={"ID":"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e","Type":"ContainerStarted","Data":"c39a40270555978d5cbcddfe8532b5aa0fdbd795db49e394a81657b30ac410cf"} Nov 29 02:45:22 crc kubenswrapper[4749]: I1129 02:45:22.054030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567fc5ff4d-d2cnq" event={"ID":"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e","Type":"ContainerStarted","Data":"4cc98b642f83152d11d4b2c998baa02bd3fca57160638c8f686b2768da4ee984"} Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.064418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-567fc5ff4d-d2cnq" event={"ID":"bd4b1978-3d21-4daa-89b8-292d5e4cdf9e","Type":"ContainerStarted","Data":"d225cb9744e973e43af766b0c398ae9535683c5473db420cf45d130499c9c1f2"} Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.064546 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.064590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-567fc5ff4d-d2cnq" Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.090263 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-567fc5ff4d-d2cnq" podStartSLOduration=3.0902457 podStartE2EDuration="3.0902457s" podCreationTimestamp="2025-11-29 02:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:45:23.084759247 +0000 UTC m=+5666.256909114" watchObservedRunningTime="2025-11-29 02:45:23.0902457 +0000 UTC m=+5666.262395557" Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.510216 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.510533 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:23 crc kubenswrapper[4749]: I1129 02:45:23.560938 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:24 crc kubenswrapper[4749]: I1129 02:45:24.157617 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:24 crc kubenswrapper[4749]: I1129 02:45:24.237749 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:25 crc kubenswrapper[4749]: I1129 02:45:25.374263 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:45:25 crc kubenswrapper[4749]: I1129 02:45:25.374341 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:45:25 crc kubenswrapper[4749]: I1129 02:45:25.717479 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94478fffc-flf7q" Nov 29 02:45:25 crc kubenswrapper[4749]: I1129 02:45:25.807399 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:45:25 crc kubenswrapper[4749]: I1129 02:45:25.807611 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="dnsmasq-dns" containerID="cri-o://7ea7f37d0c27e25a7be2387221d54c678f088f26f5d12509f20db6e5450f16f9" gracePeriod=10 Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.100783 4749 generic.go:334] "Generic (PLEG): container finished" podID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerID="7ea7f37d0c27e25a7be2387221d54c678f088f26f5d12509f20db6e5450f16f9" exitCode=0 Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.101334 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-llx8s" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="registry-server" containerID="cri-o://81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b" gracePeriod=2 Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.101624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" event={"ID":"24fb78bb-6d28-4a42-ba1a-68342ec9b88d","Type":"ContainerDied","Data":"7ea7f37d0c27e25a7be2387221d54c678f088f26f5d12509f20db6e5450f16f9"} Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.305312 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.364729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc\") pod \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.364830 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgdtn\" (UniqueName: \"kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn\") pod \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.365042 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb\") pod \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.365073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config\") pod \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.365109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb\") pod \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\" (UID: \"24fb78bb-6d28-4a42-ba1a-68342ec9b88d\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.390953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn" (OuterVolumeSpecName: "kube-api-access-vgdtn") pod "24fb78bb-6d28-4a42-ba1a-68342ec9b88d" (UID: "24fb78bb-6d28-4a42-ba1a-68342ec9b88d"). InnerVolumeSpecName "kube-api-access-vgdtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.430573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24fb78bb-6d28-4a42-ba1a-68342ec9b88d" (UID: "24fb78bb-6d28-4a42-ba1a-68342ec9b88d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.433666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24fb78bb-6d28-4a42-ba1a-68342ec9b88d" (UID: "24fb78bb-6d28-4a42-ba1a-68342ec9b88d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.443071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config" (OuterVolumeSpecName: "config") pod "24fb78bb-6d28-4a42-ba1a-68342ec9b88d" (UID: "24fb78bb-6d28-4a42-ba1a-68342ec9b88d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.466133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24fb78bb-6d28-4a42-ba1a-68342ec9b88d" (UID: "24fb78bb-6d28-4a42-ba1a-68342ec9b88d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.467655 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.467680 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgdtn\" (UniqueName: \"kubernetes.io/projected/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-kube-api-access-vgdtn\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.467696 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.467709 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.467720 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24fb78bb-6d28-4a42-ba1a-68342ec9b88d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.562108 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.670258 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvn4x\" (UniqueName: \"kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x\") pod \"2514916d-93e9-4c79-a031-07c466477796\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.670411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content\") pod \"2514916d-93e9-4c79-a031-07c466477796\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.670449 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities\") pod \"2514916d-93e9-4c79-a031-07c466477796\" (UID: \"2514916d-93e9-4c79-a031-07c466477796\") " Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.671456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities" (OuterVolumeSpecName: "utilities") pod "2514916d-93e9-4c79-a031-07c466477796" (UID: "2514916d-93e9-4c79-a031-07c466477796"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.674024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x" (OuterVolumeSpecName: "kube-api-access-fvn4x") pod "2514916d-93e9-4c79-a031-07c466477796" (UID: "2514916d-93e9-4c79-a031-07c466477796"). InnerVolumeSpecName "kube-api-access-fvn4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.718932 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2514916d-93e9-4c79-a031-07c466477796" (UID: "2514916d-93e9-4c79-a031-07c466477796"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.772788 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvn4x\" (UniqueName: \"kubernetes.io/projected/2514916d-93e9-4c79-a031-07c466477796-kube-api-access-fvn4x\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.772820 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:26 crc kubenswrapper[4749]: I1129 02:45:26.772830 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2514916d-93e9-4c79-a031-07c466477796-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.123139 4749 generic.go:334] "Generic (PLEG): container finished" podID="2514916d-93e9-4c79-a031-07c466477796" containerID="81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b" exitCode=0 Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.123307 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llx8s" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.123320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerDied","Data":"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b"} Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.123634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llx8s" event={"ID":"2514916d-93e9-4c79-a031-07c466477796","Type":"ContainerDied","Data":"e30ff8e3824235c62df749f0db51c423e67b7a6335752ec6efdca495dbba3e02"} Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.123746 4749 scope.go:117] "RemoveContainer" containerID="81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.128164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" event={"ID":"24fb78bb-6d28-4a42-ba1a-68342ec9b88d","Type":"ContainerDied","Data":"ae150141fb1d47ff1adbac4b7de836a906f91ad259fb97f15a61c3764fe483d1"} Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.128380 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58c6cd76c9-m22r4" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.178562 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.179575 4749 scope.go:117] "RemoveContainer" containerID="1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.191368 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-llx8s"] Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.201156 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.209691 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58c6cd76c9-m22r4"] Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.221483 4749 scope.go:117] "RemoveContainer" containerID="736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.270518 4749 scope.go:117] "RemoveContainer" containerID="81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b" Nov 29 02:45:27 crc kubenswrapper[4749]: E1129 02:45:27.271082 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b\": container with ID starting with 81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b not found: ID does not exist" containerID="81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.271269 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b"} err="failed to get container status \"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b\": rpc error: code = NotFound desc = could not find container \"81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b\": container with ID starting with 81c4ff9ecfabc66ad1942c2f837d4a97fda795de0bbd459dda5dac7c32d0b35b not found: ID does not exist" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.271421 4749 scope.go:117] "RemoveContainer" containerID="1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d" Nov 29 02:45:27 crc kubenswrapper[4749]: E1129 02:45:27.272103 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d\": container with ID starting with 1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d not found: ID does not exist" containerID="1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d" Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.272145 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d"} err="failed to get container status \"1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d\": rpc error: code = NotFound desc = could not find container \"1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d\": container with ID starting with 1de43fa79a368d60a0b5f99aca3d69113d09e01897b9d0ce050549cf6e9d662d not 
found: ID does not exist"
Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.272177 4749 scope.go:117] "RemoveContainer" containerID="736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6"
Nov 29 02:45:27 crc kubenswrapper[4749]: E1129 02:45:27.272535 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6\": container with ID starting with 736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6 not found: ID does not exist" containerID="736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6"
Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.272695 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6"} err="failed to get container status \"736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6\": rpc error: code = NotFound desc = could not find container \"736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6\": container with ID starting with 736e77040a9f47b1811c6faea618a4168eff7db0554fb67de98e260b2e689ad6 not found: ID does not exist"
Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.272738 4749 scope.go:117] "RemoveContainer" containerID="7ea7f37d0c27e25a7be2387221d54c678f088f26f5d12509f20db6e5450f16f9"
Nov 29 02:45:27 crc kubenswrapper[4749]: I1129 02:45:27.308718 4749 scope.go:117] "RemoveContainer" containerID="c9c6d691a73ab05bbac6e567757d1b6693716804fb7141c5d76a9746938a3743"
Nov 29 02:45:29 crc kubenswrapper[4749]: I1129 02:45:29.092391 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" path="/var/lib/kubelet/pods/24fb78bb-6d28-4a42-ba1a-68342ec9b88d/volumes"
Nov 29 02:45:29 crc kubenswrapper[4749]: I1129 02:45:29.093977 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2514916d-93e9-4c79-a031-07c466477796" path="/var/lib/kubelet/pods/2514916d-93e9-4c79-a031-07c466477796/volumes"
Nov 29 02:45:52 crc kubenswrapper[4749]: I1129 02:45:52.030449 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-567fc5ff4d-d2cnq"
Nov 29 02:45:52 crc kubenswrapper[4749]: I1129 02:45:52.031289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-567fc5ff4d-d2cnq"
Nov 29 02:45:55 crc kubenswrapper[4749]: I1129 02:45:55.374059 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:45:55 crc kubenswrapper[4749]: I1129 02:45:55.374832 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:45:55 crc kubenswrapper[4749]: I1129 02:45:55.374891 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
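The liveness probe failing above is a plain HTTP GET against the container's health endpoint: a transport-level error such as "connection refused" counts as a probe failure, and Kubernetes treats any response code from 200 up to (but not including) 400 as success. A self-contained sketch of an equivalent check, reusing the endpoint from the log (this illustrates the probe semantics, not the kubelet prober's actual implementation):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// A rough sketch of what an HTTP liveness check amounts to: issue a GET
// against the probe endpoint and treat transport errors (like the
// "connection refused" above) or status codes outside [200, 400) as
// failure. Endpoint mirrors the log; timeout is an assumed value.
func probeOnce() error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		return err // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce(); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```

After enough consecutive failures (failureThreshold, 3 by default), the kubelet kills and restarts the container, which is exactly what the next entries record.
Nov 29 02:45:55 crc kubenswrapper[4749]: I1129 02:45:55.375850 4749 kuberuntime_manager.go:1027] "Message for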
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 02:45:55 crc kubenswrapper[4749]: I1129 02:45:55.375950 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" gracePeriod=600
Nov 29 02:45:55 crc kubenswrapper[4749]: E1129 02:45:55.520457 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:45:56 crc kubenswrapper[4749]: I1129 02:45:56.465163 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" exitCode=0
Nov 29 02:45:56 crc kubenswrapper[4749]: I1129 02:45:56.465250 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"}
Nov 29 02:45:56 crc kubenswrapper[4749]: I1129 02:45:56.465305 4749 scope.go:117] "RemoveContainer" containerID="0f517e2447b6626d440180682d23cb7d40e59343ddf40e5b871367149339d323"
Nov 29 02:45:56 crc kubenswrapper[4749]: I1129 02:45:56.467479 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:45:56 crc kubenswrapper[4749]: E1129 02:45:56.472576 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:46:08 crc kubenswrapper[4749]: I1129 02:46:08.075468 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:46:08 crc kubenswrapper[4749]: E1129 02:46:08.076396 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.462331 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-8twmk"]
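The repeated "back-off 5m0s" errors show this container has reached the ceiling of the kubelet's restart back-off: the delay starts at 10s, doubles after each failed restart, and is capped at five minutes (resetting once the container has run cleanly for long enough). A small sketch of that delay schedule under the documented defaults (an illustration, not kubelet internals):

```go
package main

import (
	"fmt"
	"time"
)

// A minimal sketch of the restart back-off behind the "back-off 5m0s"
// message above: the delay doubles after each failed restart, capped at
// five minutes. The 10s base and 5m cap are the documented kubelet
// defaults; this is an illustration, not the kubelet's own code.
func restartDelay(failures int) time.Duration {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	delay := base
	for i := 1; i < failures; i++ {
		delay *= 2
		if delay >= maxDelay {
			return maxDelay
		}
	}
	return delay
}

func main() {
	for f := 1; f <= 7; f++ {
		fmt.Printf("failure %d -> wait %v\n", f, restartDelay(f))
	}
	// failures 6 and beyond hit the 5m0s cap seen in the log
}
```

On this schedule the sixth consecutive failure already pins the delay at 5m0s, which is why every sync attempt between 02:45:55 and 02:46:08 above is rejected with the same message.
Nov 29 02:46:13 crc kubenswrapper[4749]: E1129 02:46:13.464751 4749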
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="registry-server" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.464771 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="registry-server" Nov 29 02:46:13 crc kubenswrapper[4749]: E1129 02:46:13.464786 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="init" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.464791 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="init" Nov 29 02:46:13 crc kubenswrapper[4749]: E1129 02:46:13.464814 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="extract-utilities" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.464820 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="extract-utilities" Nov 29 02:46:13 crc kubenswrapper[4749]: E1129 02:46:13.464832 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="extract-content" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.464837 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="extract-content" Nov 29 02:46:13 crc kubenswrapper[4749]: E1129 02:46:13.464853 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="dnsmasq-dns" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.464859 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="dnsmasq-dns" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.465020 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2514916d-93e9-4c79-a031-07c466477796" containerName="registry-server" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.465033 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="24fb78bb-6d28-4a42-ba1a-68342ec9b88d" containerName="dnsmasq-dns" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.465607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.481552 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8twmk"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.550248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hgh7f"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.551395 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.558305 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hgh7f"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.615625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xvq5\" (UniqueName: \"kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.615677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.659128 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rs5r5"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.660324 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.676915 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5140-account-create-update-xjmf2"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.678696 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.680732 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.685033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rs5r5"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.706235 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5140-account-create-update-xjmf2"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.717296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkkgm\" (UniqueName: \"kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.717363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xvq5\" (UniqueName: \"kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.717536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.717678 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.722068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.744909 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xvq5\" (UniqueName: \"kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5\") pod \"nova-api-db-create-8twmk\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.793932 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkkgm\" (UniqueName: \"kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2rf\" (UniqueName: \"kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819210 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819235 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgww\" (UniqueName: \"kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.819262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.820147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.844277 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkkgm\" (UniqueName: \"kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm\") pod \"nova-cell0-db-create-hgh7f\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.863353 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fab4-account-create-update-vn8tp"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.864440 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.866970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.867384 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.869747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fab4-account-create-update-vn8tp"] Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.920313 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2rf\" (UniqueName: \"kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.920563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.920592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgww\" (UniqueName: \"kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.920624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc 
kubenswrapper[4749]: I1129 02:46:13.921264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.921523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.946466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgww\" (UniqueName: \"kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww\") pod \"nova-api-5140-account-create-update-xjmf2\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.951337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl2rf\" (UniqueName: \"kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf\") pod \"nova-cell1-db-create-rs5r5\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:13 crc kubenswrapper[4749]: I1129 02:46:13.975065 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.002889 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.021893 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.022097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfdv\" (UniqueName: \"kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.067645 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-117b-account-create-update-tmlsv"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.068957 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.071407 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.074807 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-117b-account-create-update-tmlsv"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.124562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.124723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfdv\" (UniqueName: \"kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.124858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vs9\" (UniqueName: \"kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.124950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.126083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.142701 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfdv\" (UniqueName: \"kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv\") pod \"nova-cell0-fab4-account-create-update-vn8tp\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.226591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.226711 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l5vs9\" (UniqueName: \"kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.227670 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.243366 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vs9\" (UniqueName: \"kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9\") pod \"nova-cell1-117b-account-create-update-tmlsv\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.282134 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.292385 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-8twmk"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.419269 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.479703 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hgh7f"] Nov 29 02:46:14 crc kubenswrapper[4749]: W1129 02:46:14.493453 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8efc8ea_5df9_437b_a0cf_c65eb1e6b454.slice/crio-4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873 WatchSource:0}: Error finding container 4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873: Status 404 returned error can't find the container with id 4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873 Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.562634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5140-account-create-update-xjmf2"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.593676 4749 scope.go:117] "RemoveContainer" containerID="24b1bd9958e0b0112f79662b2ec83e0e7b4d598f8f264e5f318f3894619ddbed" Nov 29 02:46:14 crc kubenswrapper[4749]: W1129 02:46:14.593932 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90f876a_7350_43b0_ad17_c0579cbf013d.slice/crio-a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0 WatchSource:0}: Error finding container a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0: Status 404 returned error can't find the container with id a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0 Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.633127 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rs5r5"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 
02:46:14.673054 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-117b-account-create-update-tmlsv"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.692685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hgh7f" event={"ID":"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454","Type":"ContainerStarted","Data":"4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.693689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rs5r5" event={"ID":"336fb058-ef0f-445c-b051-d08c95431b6b","Type":"ContainerStarted","Data":"5f40c4346e836725ba0150abb72b88a95beecbcce1b8b320fc2cbe6f69e7eeb7"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.695538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8twmk" event={"ID":"5915b3e3-1865-484d-88aa-54633385cf59","Type":"ContainerStarted","Data":"30c9e3e80aca8a2d9aab37673fe910cc0a3959848920f32cc9409010ffda0b62"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.695562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8twmk" event={"ID":"5915b3e3-1865-484d-88aa-54633385cf59","Type":"ContainerStarted","Data":"c4461f00d47ac5f11f2c49f5a5d59da4f8e5f91a9dd812f744bcf33c89d10acd"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.697291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" event={"ID":"f0a66458-2d12-4987-b43f-735fe5fb5528","Type":"ContainerStarted","Data":"b56e17295e10fd06419cbba5b987bd697a0d91abe168bf3d0757c657c7c21286"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.698608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5140-account-create-update-xjmf2" event={"ID":"c90f876a-7350-43b0-ad17-c0579cbf013d","Type":"ContainerStarted","Data":"a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0"} Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.704696 4749 scope.go:117] "RemoveContainer" containerID="95f8e8edb0d102394907d307dd76ecf034839055ea28647f3ab96c6ca6383bdf" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.726189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fab4-account-create-update-vn8tp"] Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.766611 4749 scope.go:117] "RemoveContainer" containerID="a69b848dea3f1c1e36fffcf38e28d5fccd53eb9e8122ff35c02a5e70e967c431" Nov 29 02:46:14 crc kubenswrapper[4749]: I1129 02:46:14.960580 4749 scope.go:117] "RemoveContainer" containerID="c303996bef9b52e916bc3ddc08bb69ff1013824ddd651abe0cea0b511b8436e9" Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.711877 4749 generic.go:334] "Generic (PLEG): container finished" podID="b9eeacbb-6c08-475f-812d-bc6c94c22fe6" containerID="14b4d2fe98f5cef9a7413baeb1ee82059abf4722439711d9d1988bd31c44fe1e" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.711985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" event={"ID":"b9eeacbb-6c08-475f-812d-bc6c94c22fe6","Type":"ContainerDied","Data":"14b4d2fe98f5cef9a7413baeb1ee82059abf4722439711d9d1988bd31c44fe1e"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.712432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" 
event={"ID":"b9eeacbb-6c08-475f-812d-bc6c94c22fe6","Type":"ContainerStarted","Data":"46cf15ea6578875747115c97ab6f127d946df4e0ca5e90b4c74b619a93f6e4ed"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.716280 4749 generic.go:334] "Generic (PLEG): container finished" podID="336fb058-ef0f-445c-b051-d08c95431b6b" containerID="67af94c1d6da024cebfe8f79403c4f2253203067dfb763245b8e322eda8571d1" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.716351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rs5r5" event={"ID":"336fb058-ef0f-445c-b051-d08c95431b6b","Type":"ContainerDied","Data":"67af94c1d6da024cebfe8f79403c4f2253203067dfb763245b8e322eda8571d1"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.718855 4749 generic.go:334] "Generic (PLEG): container finished" podID="5915b3e3-1865-484d-88aa-54633385cf59" containerID="30c9e3e80aca8a2d9aab37673fe910cc0a3959848920f32cc9409010ffda0b62" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.718928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8twmk" event={"ID":"5915b3e3-1865-484d-88aa-54633385cf59","Type":"ContainerDied","Data":"30c9e3e80aca8a2d9aab37673fe910cc0a3959848920f32cc9409010ffda0b62"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.720612 4749 generic.go:334] "Generic (PLEG): container finished" podID="c90f876a-7350-43b0-ad17-c0579cbf013d" containerID="5018f581bbf19aeca63686918b4973bfb894222ff0095532434bb8e4451105d9" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.720687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5140-account-create-update-xjmf2" event={"ID":"c90f876a-7350-43b0-ad17-c0579cbf013d","Type":"ContainerDied","Data":"5018f581bbf19aeca63686918b4973bfb894222ff0095532434bb8e4451105d9"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.723812 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0a66458-2d12-4987-b43f-735fe5fb5528" containerID="e15febb8618783e33f4b1b8e1b15ecf071c9e7f8d01ae59128af43c4966a1e7b" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.723865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" event={"ID":"f0a66458-2d12-4987-b43f-735fe5fb5528","Type":"ContainerDied","Data":"e15febb8618783e33f4b1b8e1b15ecf071c9e7f8d01ae59128af43c4966a1e7b"} Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.726229 4749 generic.go:334] "Generic (PLEG): container finished" podID="a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" containerID="0081f70d8987acdf116f6c43199fdcdeafe856189fd8154e95fe7e4c85f2f0a8" exitCode=0 Nov 29 02:46:15 crc kubenswrapper[4749]: I1129 02:46:15.726268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hgh7f" event={"ID":"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454","Type":"ContainerDied","Data":"0081f70d8987acdf116f6c43199fdcdeafe856189fd8154e95fe7e4c85f2f0a8"} Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.174758 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.264979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xvq5\" (UniqueName: \"kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5\") pod \"5915b3e3-1865-484d-88aa-54633385cf59\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.265027 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts\") pod \"5915b3e3-1865-484d-88aa-54633385cf59\" (UID: \"5915b3e3-1865-484d-88aa-54633385cf59\") " Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.265812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5915b3e3-1865-484d-88aa-54633385cf59" (UID: "5915b3e3-1865-484d-88aa-54633385cf59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.273784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5" (OuterVolumeSpecName: "kube-api-access-9xvq5") pod "5915b3e3-1865-484d-88aa-54633385cf59" (UID: "5915b3e3-1865-484d-88aa-54633385cf59"). InnerVolumeSpecName "kube-api-access-9xvq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.370326 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xvq5\" (UniqueName: \"kubernetes.io/projected/5915b3e3-1865-484d-88aa-54633385cf59-kube-api-access-9xvq5\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.370357 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5915b3e3-1865-484d-88aa-54633385cf59-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.740394 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-8twmk" Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.740406 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-8twmk" event={"ID":"5915b3e3-1865-484d-88aa-54633385cf59","Type":"ContainerDied","Data":"c4461f00d47ac5f11f2c49f5a5d59da4f8e5f91a9dd812f744bcf33c89d10acd"} Nov 29 02:46:16 crc kubenswrapper[4749]: I1129 02:46:16.740449 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4461f00d47ac5f11f2c49f5a5d59da4f8e5f91a9dd812f744bcf33c89d10acd" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.150899 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.278964 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.285080 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.291099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts\") pod \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.291217 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkkgm\" (UniqueName: \"kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm\") pod \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\" (UID: \"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.292244 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" (UID: "a8efc8ea-5df9-437b-a0cf-c65eb1e6b454"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.293957 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.308558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm" (OuterVolumeSpecName: "kube-api-access-wkkgm") pod "a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" (UID: "a8efc8ea-5df9-437b-a0cf-c65eb1e6b454"). InnerVolumeSpecName "kube-api-access-wkkgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.309427 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgww\" (UniqueName: \"kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww\") pod \"c90f876a-7350-43b0-ad17-c0579cbf013d\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts\") pod \"f0a66458-2d12-4987-b43f-735fe5fb5528\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl2rf\" (UniqueName: \"kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf\") pod \"336fb058-ef0f-445c-b051-d08c95431b6b\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392454 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vs9\" (UniqueName: \"kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9\") pod \"f0a66458-2d12-4987-b43f-735fe5fb5528\" (UID: \"f0a66458-2d12-4987-b43f-735fe5fb5528\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts\") pod \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392539 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfdv\" (UniqueName: \"kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv\") pod \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\" (UID: \"b9eeacbb-6c08-475f-812d-bc6c94c22fe6\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392608 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts\") pod \"336fb058-ef0f-445c-b051-d08c95431b6b\" (UID: \"336fb058-ef0f-445c-b051-d08c95431b6b\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.392654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts\") pod \"c90f876a-7350-43b0-ad17-c0579cbf013d\" (UID: \"c90f876a-7350-43b0-ad17-c0579cbf013d\") " Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.393124 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkkgm\" (UniqueName: \"kubernetes.io/projected/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-kube-api-access-wkkgm\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.393219 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 
02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.394059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "336fb058-ef0f-445c-b051-d08c95431b6b" (UID: "336fb058-ef0f-445c-b051-d08c95431b6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.394295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c90f876a-7350-43b0-ad17-c0579cbf013d" (UID: "c90f876a-7350-43b0-ad17-c0579cbf013d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.394381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9eeacbb-6c08-475f-812d-bc6c94c22fe6" (UID: "b9eeacbb-6c08-475f-812d-bc6c94c22fe6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.395221 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0a66458-2d12-4987-b43f-735fe5fb5528" (UID: "f0a66458-2d12-4987-b43f-735fe5fb5528"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.396785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv" (OuterVolumeSpecName: "kube-api-access-ggfdv") pod "b9eeacbb-6c08-475f-812d-bc6c94c22fe6" (UID: "b9eeacbb-6c08-475f-812d-bc6c94c22fe6"). InnerVolumeSpecName "kube-api-access-ggfdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.397176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf" (OuterVolumeSpecName: "kube-api-access-pl2rf") pod "336fb058-ef0f-445c-b051-d08c95431b6b" (UID: "336fb058-ef0f-445c-b051-d08c95431b6b"). InnerVolumeSpecName "kube-api-access-pl2rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.398092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9" (OuterVolumeSpecName: "kube-api-access-l5vs9") pod "f0a66458-2d12-4987-b43f-735fe5fb5528" (UID: "f0a66458-2d12-4987-b43f-735fe5fb5528"). InnerVolumeSpecName "kube-api-access-l5vs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.398692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww" (OuterVolumeSpecName: "kube-api-access-qvgww") pod "c90f876a-7350-43b0-ad17-c0579cbf013d" (UID: "c90f876a-7350-43b0-ad17-c0579cbf013d"). 
InnerVolumeSpecName "kube-api-access-qvgww". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495142 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgww\" (UniqueName: \"kubernetes.io/projected/c90f876a-7350-43b0-ad17-c0579cbf013d-kube-api-access-qvgww\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495191 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0a66458-2d12-4987-b43f-735fe5fb5528-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495229 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl2rf\" (UniqueName: \"kubernetes.io/projected/336fb058-ef0f-445c-b051-d08c95431b6b-kube-api-access-pl2rf\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495250 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vs9\" (UniqueName: \"kubernetes.io/projected/f0a66458-2d12-4987-b43f-735fe5fb5528-kube-api-access-l5vs9\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495271 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495291 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggfdv\" (UniqueName: \"kubernetes.io/projected/b9eeacbb-6c08-475f-812d-bc6c94c22fe6-kube-api-access-ggfdv\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495308 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/336fb058-ef0f-445c-b051-d08c95431b6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.495324 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90f876a-7350-43b0-ad17-c0579cbf013d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.754419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hgh7f" event={"ID":"a8efc8ea-5df9-437b-a0cf-c65eb1e6b454","Type":"ContainerDied","Data":"4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873"} Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.754477 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6ac27832bec0f94ed65f42548f25fd2db587c84acd3d54d9ee1177e2de7873" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.755377 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hgh7f" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.756853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" event={"ID":"b9eeacbb-6c08-475f-812d-bc6c94c22fe6","Type":"ContainerDied","Data":"46cf15ea6578875747115c97ab6f127d946df4e0ca5e90b4c74b619a93f6e4ed"} Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.756900 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46cf15ea6578875747115c97ab6f127d946df4e0ca5e90b4c74b619a93f6e4ed" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.756906 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fab4-account-create-update-vn8tp" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.759378 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rs5r5" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.760411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rs5r5" event={"ID":"336fb058-ef0f-445c-b051-d08c95431b6b","Type":"ContainerDied","Data":"5f40c4346e836725ba0150abb72b88a95beecbcce1b8b320fc2cbe6f69e7eeb7"} Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.760464 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f40c4346e836725ba0150abb72b88a95beecbcce1b8b320fc2cbe6f69e7eeb7" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.762901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" event={"ID":"f0a66458-2d12-4987-b43f-735fe5fb5528","Type":"ContainerDied","Data":"b56e17295e10fd06419cbba5b987bd697a0d91abe168bf3d0757c657c7c21286"} Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.762947 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56e17295e10fd06419cbba5b987bd697a0d91abe168bf3d0757c657c7c21286" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.763016 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-117b-account-create-update-tmlsv" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.770794 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5140-account-create-update-xjmf2" event={"ID":"c90f876a-7350-43b0-ad17-c0579cbf013d","Type":"ContainerDied","Data":"a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0"} Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.770848 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c8ae123430cd7497a0e47a64ad8458a62e832743c62db8969f6b3906434ff0" Nov 29 02:46:17 crc kubenswrapper[4749]: I1129 02:46:17.770957 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5140-account-create-update-xjmf2" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098374 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n2nl8"] Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336fb058-ef0f-445c-b051-d08c95431b6b" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098660 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="336fb058-ef0f-445c-b051-d08c95431b6b" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098675 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5915b3e3-1865-484d-88aa-54633385cf59" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098682 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5915b3e3-1865-484d-88aa-54633385cf59" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098702 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90f876a-7350-43b0-ad17-c0579cbf013d" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098707 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90f876a-7350-43b0-ad17-c0579cbf013d" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098720 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a66458-2d12-4987-b43f-735fe5fb5528" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098727 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a66458-2d12-4987-b43f-735fe5fb5528" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098739 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098745 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: E1129 02:46:19.098761 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eeacbb-6c08-475f-812d-bc6c94c22fe6" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.098766 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eeacbb-6c08-475f-812d-bc6c94c22fe6" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111367 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111421 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="336fb058-ef0f-445c-b051-d08c95431b6b" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111452 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90f876a-7350-43b0-ad17-c0579cbf013d" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111468 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f0a66458-2d12-4987-b43f-735fe5fb5528" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111484 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5915b3e3-1865-484d-88aa-54633385cf59" containerName="mariadb-database-create" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.111496 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eeacbb-6c08-475f-812d-bc6c94c22fe6" containerName="mariadb-account-create-update" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.112186 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n2nl8"] Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.112299 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.114936 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.115134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xms8w" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.115302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.229375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.229426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.229525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vv27\" (UniqueName: \"kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.229629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.330944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.331047 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vv27\" (UniqueName: \"kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.331154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.331190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.335666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.342682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.343104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.351453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vv27\" (UniqueName: \"kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27\") pod \"nova-cell0-conductor-db-sync-n2nl8\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:19 crc kubenswrapper[4749]: I1129 02:46:19.440398 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:20 crc kubenswrapper[4749]: I1129 02:46:20.075830 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:46:20 crc kubenswrapper[4749]: E1129 02:46:20.076311 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:46:20 crc kubenswrapper[4749]: I1129 02:46:20.503871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n2nl8"] Nov 29 02:46:20 crc kubenswrapper[4749]: I1129 02:46:20.828801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" event={"ID":"fbec9973-7a01-4dcd-af32-be7f72b9d461","Type":"ContainerStarted","Data":"dceaec226c6160677deef66ecbe5eb22dadd413eea3a4f96ca160a2f2d415084"} Nov 29 02:46:20 crc kubenswrapper[4749]: I1129 02:46:20.829147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" event={"ID":"fbec9973-7a01-4dcd-af32-be7f72b9d461","Type":"ContainerStarted","Data":"4e900c69a1873f1276b39961d926a8ce4b4638d76252517b52abe307abeb1852"} Nov 29 02:46:20 crc kubenswrapper[4749]: I1129 02:46:20.866810 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" podStartSLOduration=1.866787339 podStartE2EDuration="1.866787339s" podCreationTimestamp="2025-11-29 02:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:20.853575787 +0000 UTC m=+5724.025725684" watchObservedRunningTime="2025-11-29 02:46:20.866787339 +0000 UTC m=+5724.038937226" Nov 29 02:46:25 crc kubenswrapper[4749]: I1129 02:46:25.894408 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbec9973-7a01-4dcd-af32-be7f72b9d461" containerID="dceaec226c6160677deef66ecbe5eb22dadd413eea3a4f96ca160a2f2d415084" exitCode=0 Nov 29 02:46:25 crc kubenswrapper[4749]: I1129 02:46:25.894597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" event={"ID":"fbec9973-7a01-4dcd-af32-be7f72b9d461","Type":"ContainerDied","Data":"dceaec226c6160677deef66ecbe5eb22dadd413eea3a4f96ca160a2f2d415084"} Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.387892 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.515275 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts\") pod \"fbec9973-7a01-4dcd-af32-be7f72b9d461\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.517137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data\") pod \"fbec9973-7a01-4dcd-af32-be7f72b9d461\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.517333 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle\") pod \"fbec9973-7a01-4dcd-af32-be7f72b9d461\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.517434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vv27\" (UniqueName: \"kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27\") pod \"fbec9973-7a01-4dcd-af32-be7f72b9d461\" (UID: \"fbec9973-7a01-4dcd-af32-be7f72b9d461\") " Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.522560 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts" (OuterVolumeSpecName: "scripts") pod "fbec9973-7a01-4dcd-af32-be7f72b9d461" (UID: "fbec9973-7a01-4dcd-af32-be7f72b9d461"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.524412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27" (OuterVolumeSpecName: "kube-api-access-2vv27") pod "fbec9973-7a01-4dcd-af32-be7f72b9d461" (UID: "fbec9973-7a01-4dcd-af32-be7f72b9d461"). InnerVolumeSpecName "kube-api-access-2vv27". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.544433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data" (OuterVolumeSpecName: "config-data") pod "fbec9973-7a01-4dcd-af32-be7f72b9d461" (UID: "fbec9973-7a01-4dcd-af32-be7f72b9d461"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.552559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbec9973-7a01-4dcd-af32-be7f72b9d461" (UID: "fbec9973-7a01-4dcd-af32-be7f72b9d461"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.619892 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.619947 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vv27\" (UniqueName: \"kubernetes.io/projected/fbec9973-7a01-4dcd-af32-be7f72b9d461-kube-api-access-2vv27\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.619969 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.619985 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbec9973-7a01-4dcd-af32-be7f72b9d461-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.919946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" event={"ID":"fbec9973-7a01-4dcd-af32-be7f72b9d461","Type":"ContainerDied","Data":"4e900c69a1873f1276b39961d926a8ce4b4638d76252517b52abe307abeb1852"} Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.920003 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e900c69a1873f1276b39961d926a8ce4b4638d76252517b52abe307abeb1852" Nov 29 02:46:27 crc kubenswrapper[4749]: I1129 02:46:27.920072 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n2nl8" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.010821 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:46:28 crc kubenswrapper[4749]: E1129 02:46:28.011519 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbec9973-7a01-4dcd-af32-be7f72b9d461" containerName="nova-cell0-conductor-db-sync" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.011543 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbec9973-7a01-4dcd-af32-be7f72b9d461" containerName="nova-cell0-conductor-db-sync" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.011763 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbec9973-7a01-4dcd-af32-be7f72b9d461" containerName="nova-cell0-conductor-db-sync" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.012496 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.016402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xms8w" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.018783 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.028983 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.129544 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.129594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnlxn\" (UniqueName: \"kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.129691 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.231423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.231585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.231627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnlxn\" (UniqueName: \"kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.237137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.238012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.248470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnlxn\" (UniqueName: \"kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn\") pod \"nova-cell0-conductor-0\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.336610 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.865427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:46:28 crc kubenswrapper[4749]: I1129 02:46:28.929897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b","Type":"ContainerStarted","Data":"e8c9f3e5795767ff07d20f6174545d02aa84f5ca9d14ea5c2eeb32dd3460b547"} Nov 29 02:46:29 crc kubenswrapper[4749]: I1129 02:46:29.942487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b","Type":"ContainerStarted","Data":"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d"} Nov 29 02:46:29 crc kubenswrapper[4749]: I1129 02:46:29.944442 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:29 crc kubenswrapper[4749]: I1129 02:46:29.971162 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.971115601 podStartE2EDuration="2.971115601s" podCreationTimestamp="2025-11-29 02:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:29.965796172 +0000 UTC m=+5733.137946039" watchObservedRunningTime="2025-11-29 02:46:29.971115601 +0000 UTC m=+5733.143265458" Nov 29 02:46:33 crc kubenswrapper[4749]: I1129 02:46:33.076144 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:46:33 crc kubenswrapper[4749]: E1129 02:46:33.077033 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:46:38 crc kubenswrapper[4749]: I1129 02:46:38.372733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 02:46:38 crc kubenswrapper[4749]: I1129 02:46:38.995130 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ghjjm"] Nov 29 02:46:38 crc kubenswrapper[4749]: I1129 02:46:38.996438 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.002152 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.004721 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.007657 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghjjm"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.107402 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.109370 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.121435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.133926 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.200650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.200743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkwv\" (UniqueName: \"kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.200789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.200921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.234083 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.235444 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.238623 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.275359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306443 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkwv\" (UniqueName: \"kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmk5\" (UniqueName: \"kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.306644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.318678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.322505 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.326783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.365467 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkwv\" (UniqueName: \"kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv\") pod \"nova-cell0-cell-mapping-ghjjm\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") " pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.372032 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.373653 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.378696 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.390474 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmk5\" (UniqueName: \"kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 
02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.408514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpppq\" (UniqueName: \"kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.415864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.432435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.432530 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.435762 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.438800 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.441722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmk5\" (UniqueName: \"kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5\") pod \"nova-scheduler-0\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.481514 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.494391 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.495993 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.502462 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.509735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.509821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.509871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.509963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpppq\" (UniqueName: \"kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.509980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.510017 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.510036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46vt\" (UniqueName: \"kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.510058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.510144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.514220 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.518792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.531375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpppq\" (UniqueName: \"kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq\") pod \"nova-api-0\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") " pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.549478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmhbq\" (UniqueName: \"kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611675 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611720 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: 
I1129 02:46:39.611748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46vt\" (UniqueName: \"kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd42c\" (UniqueName: \"kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.611959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.613834 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.613902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.614703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.616260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.616515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.624781 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghjjm" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.629168 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46vt\" (UniqueName: \"kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt\") pod \"nova-metadata-0\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") " pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd42c\" (UniqueName: \"kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.715988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmhbq\" (UniqueName: \"kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.716028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 
02:46:39.716870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.717037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.717526 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.717573 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.719916 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.720006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.731430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmhbq\" (UniqueName: \"kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq\") pod \"nova-cell1-novncproxy-0\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.740085 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd42c\" (UniqueName: \"kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c\") pod \"dnsmasq-dns-7cb9ccdbc-x84nj\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.740273 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.797646 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.823414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.833127 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.968277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cmpmq"] Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.969745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.974563 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.974581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 29 02:46:39 crc kubenswrapper[4749]: I1129 02:46:39.976010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cmpmq"] Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.025578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.025665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cz7x\" (UniqueName: \"kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.025694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.025840 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.056821 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0db52c_f857_49b8_81f8_adacdf9791d7.slice/crio-e049b62c2faa7e5db8e19da786b1afdd2ddc7ac25ae0809416673d265fbdd30a WatchSource:0}: Error finding container e049b62c2faa7e5db8e19da786b1afdd2ddc7ac25ae0809416673d265fbdd30a: Status 404 returned error can't find the container with id e049b62c2faa7e5db8e19da786b1afdd2ddc7ac25ae0809416673d265fbdd30a Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.060660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.127110 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.127165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cz7x\" (UniqueName: \"kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.127186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.127278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.132761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.133077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.133411 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.142005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cz7x\" (UniqueName: \"kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x\") pod \"nova-cell1-conductor-db-sync-cmpmq\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") " pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.171827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghjjm"] Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.174191 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc5d1a2_4a00_4d6c_b8ed_4672d67ec7f1.slice/crio-64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b WatchSource:0}: Error finding container 64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b: Status 404 returned error can't find the container with id 64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.299824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.320888 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.331293 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb7391ad_897b_4359_88df_ace9ae2f0517.slice/crio-1ca8e828303828ad5de75abec36dc1683793ce4cdbbc69bf060d112715837be6 WatchSource:0}: Error finding container 1ca8e828303828ad5de75abec36dc1683793ce4cdbbc69bf060d112715837be6: Status 404 returned error can't find the container with id 1ca8e828303828ad5de75abec36dc1683793ce4cdbbc69bf060d112715837be6 Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.409032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.419369 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f55b697_cee2_42dd_bcb8_898283e0da54.slice/crio-ec264bd3024ad4cdbc85b0f522e975c41f27a4b08808ccdd5bb9d787a9ae6027 WatchSource:0}: Error finding container ec264bd3024ad4cdbc85b0f522e975c41f27a4b08808ccdd5bb9d787a9ae6027: Status 404 returned error can't find the container with id ec264bd3024ad4cdbc85b0f522e975c41f27a4b08808ccdd5bb9d787a9ae6027 Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.445352 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dd1cf26_1b5c_4f49_9790_f87d30cf5c4e.slice/crio-20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2 WatchSource:0}: Error finding container 20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2: Status 404 returned error can't find the container with id 20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2 Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.449879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.460433 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:46:40 crc kubenswrapper[4749]: I1129 02:46:40.866714 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cmpmq"] Nov 29 02:46:40 crc kubenswrapper[4749]: W1129 02:46:40.870023 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6410f87_bd21_41e5_8f40_f7a9ea97da54.slice/crio-108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411 WatchSource:0}: Error finding container 108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411: Status 404 returned error can't find the container with id 
108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411 Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.063768 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e","Type":"ContainerStarted","Data":"b48ace4a6cdadc6166c13d965648a3a6d29494cc118013971549e3d2de5decb4"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.064115 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e","Type":"ContainerStarted","Data":"20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.065176 4749 generic.go:334] "Generic (PLEG): container finished" podID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerID="2f717d614f2651123b57d772dc39cad53b029c47f96543eb5859af336854dcf1" exitCode=0 Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.065237 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" event={"ID":"aaa73ce0-0bac-482b-9076-cc55df3efd74","Type":"ContainerDied","Data":"2f717d614f2651123b57d772dc39cad53b029c47f96543eb5859af336854dcf1"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.065252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" event={"ID":"aaa73ce0-0bac-482b-9076-cc55df3efd74","Type":"ContainerStarted","Data":"c185342d86f829890b2427ece89394cabca6b45b7fdf5544382636255474b67d"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.068718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghjjm" event={"ID":"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1","Type":"ContainerStarted","Data":"49b6b7ed109ea5fe37214b44cce0145d27b1102cadf4c143b3dcb6ef7c389331"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.068828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghjjm" event={"ID":"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1","Type":"ContainerStarted","Data":"64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.087992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerStarted","Data":"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.088607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerStarted","Data":"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.088706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerStarted","Data":"ec264bd3024ad4cdbc85b0f522e975c41f27a4b08808ccdd5bb9d787a9ae6027"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.090605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb7391ad-897b-4359-88df-ace9ae2f0517","Type":"ContainerStarted","Data":"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.090641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"cb7391ad-897b-4359-88df-ace9ae2f0517","Type":"ContainerStarted","Data":"1ca8e828303828ad5de75abec36dc1683793ce4cdbbc69bf060d112715837be6"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.092721 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.092707442 podStartE2EDuration="2.092707442s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.083699113 +0000 UTC m=+5744.255848960" watchObservedRunningTime="2025-11-29 02:46:41.092707442 +0000 UTC m=+5744.264857299" Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.092834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerStarted","Data":"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.092859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerStarted","Data":"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.092869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerStarted","Data":"e049b62c2faa7e5db8e19da786b1afdd2ddc7ac25ae0809416673d265fbdd30a"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.094446 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" event={"ID":"d6410f87-bd21-41e5-8f40-f7a9ea97da54","Type":"ContainerStarted","Data":"fa364c66da2b0c10a56bba19b95062b2dd2c9e8c2e6e8596701cfe10d7a238d3"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.094470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" event={"ID":"d6410f87-bd21-41e5-8f40-f7a9ea97da54","Type":"ContainerStarted","Data":"108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411"} Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.174660 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" podStartSLOduration=2.174636324 podStartE2EDuration="2.174636324s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.166634999 +0000 UTC m=+5744.338784866" watchObservedRunningTime="2025-11-29 02:46:41.174636324 +0000 UTC m=+5744.346786181" Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.176555 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ghjjm" podStartSLOduration=3.17654731 podStartE2EDuration="3.17654731s" podCreationTimestamp="2025-11-29 02:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.147293579 +0000 UTC m=+5744.319443446" watchObservedRunningTime="2025-11-29 02:46:41.17654731 +0000 UTC m=+5744.348697177" Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.215333 4749 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-api-0" podStartSLOduration=2.215310492 podStartE2EDuration="2.215310492s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.186485612 +0000 UTC m=+5744.358635469" watchObservedRunningTime="2025-11-29 02:46:41.215310492 +0000 UTC m=+5744.387460369"
Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.229409 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.229391765 podStartE2EDuration="2.229391765s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.212208217 +0000 UTC m=+5744.384358074" watchObservedRunningTime="2025-11-29 02:46:41.229391765 +0000 UTC m=+5744.401541622"
Nov 29 02:46:41 crc kubenswrapper[4749]: I1129 02:46:41.241731 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.241713384 podStartE2EDuration="2.241713384s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:41.235901913 +0000 UTC m=+5744.408051780" watchObservedRunningTime="2025-11-29 02:46:41.241713384 +0000 UTC m=+5744.413863241"
Nov 29 02:46:42 crc kubenswrapper[4749]: I1129 02:46:42.122346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" event={"ID":"aaa73ce0-0bac-482b-9076-cc55df3efd74","Type":"ContainerStarted","Data":"dd4ca9ae891caac20eb8b1abc5492777fdf1a7e82d7113e249b87586b02b0e5d"}
Nov 29 02:46:42 crc kubenswrapper[4749]: I1129 02:46:42.162668 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" podStartSLOduration=3.162642007 podStartE2EDuration="3.162642007s" podCreationTimestamp="2025-11-29 02:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:42.148515703 +0000 UTC m=+5745.320665590" watchObservedRunningTime="2025-11-29 02:46:42.162642007 +0000 UTC m=+5745.334791894"
Nov 29 02:46:43 crc kubenswrapper[4749]: I1129 02:46:43.130275 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj"
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.144901 4749 generic.go:334] "Generic (PLEG): container finished" podID="d6410f87-bd21-41e5-8f40-f7a9ea97da54" containerID="fa364c66da2b0c10a56bba19b95062b2dd2c9e8c2e6e8596701cfe10d7a238d3" exitCode=0
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.145420 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" event={"ID":"d6410f87-bd21-41e5-8f40-f7a9ea97da54","Type":"ContainerDied","Data":"fa364c66da2b0c10a56bba19b95062b2dd2c9e8c2e6e8596701cfe10d7a238d3"}
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.741560 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.799304 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.799381 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 29 02:46:44 crc kubenswrapper[4749]: I1129 02:46:44.823927 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.075468 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:46:45 crc kubenswrapper[4749]: E1129 02:46:45.075836 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.160621 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" containerID="49b6b7ed109ea5fe37214b44cce0145d27b1102cadf4c143b3dcb6ef7c389331" exitCode=0
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.160700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghjjm" event={"ID":"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1","Type":"ContainerDied","Data":"49b6b7ed109ea5fe37214b44cce0145d27b1102cadf4c143b3dcb6ef7c389331"}
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.708250 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cmpmq"
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.901892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle\") pod \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") "
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.901975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data\") pod \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") "
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.902060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts\") pod \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") "
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.902703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cz7x\" (UniqueName: \"kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x\") pod \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\" (UID: \"d6410f87-bd21-41e5-8f40-f7a9ea97da54\") "
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.911618 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x" (OuterVolumeSpecName: "kube-api-access-8cz7x") pod "d6410f87-bd21-41e5-8f40-f7a9ea97da54" (UID: "d6410f87-bd21-41e5-8f40-f7a9ea97da54"). InnerVolumeSpecName "kube-api-access-8cz7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.913517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts" (OuterVolumeSpecName: "scripts") pod "d6410f87-bd21-41e5-8f40-f7a9ea97da54" (UID: "d6410f87-bd21-41e5-8f40-f7a9ea97da54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.935999 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data" (OuterVolumeSpecName: "config-data") pod "d6410f87-bd21-41e5-8f40-f7a9ea97da54" (UID: "d6410f87-bd21-41e5-8f40-f7a9ea97da54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:45 crc kubenswrapper[4749]: I1129 02:46:45.959947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6410f87-bd21-41e5-8f40-f7a9ea97da54" (UID: "d6410f87-bd21-41e5-8f40-f7a9ea97da54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.004554 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.004897 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.005029 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6410f87-bd21-41e5-8f40-f7a9ea97da54-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.005142 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cz7x\" (UniqueName: \"kubernetes.io/projected/d6410f87-bd21-41e5-8f40-f7a9ea97da54-kube-api-access-8cz7x\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.173358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cmpmq" event={"ID":"d6410f87-bd21-41e5-8f40-f7a9ea97da54","Type":"ContainerDied","Data":"108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411"}
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.176547 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108493fce7b7710f83ccef74bd8b3bbb72784790c5cda3c25ae27a0ab0342411"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.173386 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cmpmq"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.274417 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 02:46:46 crc kubenswrapper[4749]: E1129 02:46:46.274815 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6410f87-bd21-41e5-8f40-f7a9ea97da54" containerName="nova-cell1-conductor-db-sync"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.274827 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6410f87-bd21-41e5-8f40-f7a9ea97da54" containerName="nova-cell1-conductor-db-sync"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.275008 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6410f87-bd21-41e5-8f40-f7a9ea97da54" containerName="nova-cell1-conductor-db-sync"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.275646 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.280240 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.315590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.315666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.315760 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgk7\" (UniqueName: \"kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.319387 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.416885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.416973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drgk7\" (UniqueName: \"kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.417023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.421217 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.422006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.434734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgk7\" (UniqueName: \"kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7\") pod \"nova-cell1-conductor-0\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.589909 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghjjm"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.601736 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.620837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts\") pod \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") "
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.620874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle\") pod \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") "
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.620925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkwv\" (UniqueName: \"kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv\") pod \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") "
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.620948 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data\") pod \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\" (UID: \"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1\") "
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.634774 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv" (OuterVolumeSpecName: "kube-api-access-vqkwv") pod "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" (UID: "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1"). InnerVolumeSpecName "kube-api-access-vqkwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.640133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts" (OuterVolumeSpecName: "scripts") pod "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" (UID: "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.664791 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" (UID: "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.669079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data" (OuterVolumeSpecName: "config-data") pod "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" (UID: "ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.723113 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.723476 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.723492 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqkwv\" (UniqueName: \"kubernetes.io/projected/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-kube-api-access-vqkwv\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:46 crc kubenswrapper[4749]: I1129 02:46:46.723506 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:47 crc kubenswrapper[4749]: W1129 02:46:47.083116 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b019e06_5331_4fc8_b736_dd6e5670c45c.slice/crio-db9270c6dfe098126977ffe5d502d3884a9515546ecbd1a33368065d30652c0e WatchSource:0}: Error finding container db9270c6dfe098126977ffe5d502d3884a9515546ecbd1a33368065d30652c0e: Status 404 returned error can't find the container with id db9270c6dfe098126977ffe5d502d3884a9515546ecbd1a33368065d30652c0e
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.095797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.184504 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ghjjm"
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.184566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ghjjm" event={"ID":"ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1","Type":"ContainerDied","Data":"64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b"}
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.184608 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64ce22c1281adf6c5def05db5bde91426142e1b069814039a8560a64374be74b"
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.185820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7b019e06-5331-4fc8-b736-dd6e5670c45c","Type":"ContainerStarted","Data":"db9270c6dfe098126977ffe5d502d3884a9515546ecbd1a33368065d30652c0e"}
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.380330 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.380618 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cb7391ad-897b-4359-88df-ace9ae2f0517" containerName="nova-scheduler-scheduler" containerID="cri-o://37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f" gracePeriod=30
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.389125 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.389391 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-api" containerID="cri-o://a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6" gracePeriod=30
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.389560 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-log" containerID="cri-o://790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb" gracePeriod=30
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.448450 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.448747 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-log" containerID="cri-o://5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2" gracePeriod=30
Nov 29 02:46:47 crc kubenswrapper[4749]: I1129 02:46:47.448837 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-metadata" containerID="cri-o://049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb" gracePeriod=30
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.135175 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.141480 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.220102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7b019e06-5331-4fc8-b736-dd6e5670c45c","Type":"ContainerStarted","Data":"16a5834f7550b4b2fae489791dea50dada2a6621a59520c5a9b1a9075200778a"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.220806 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222682 4749 generic.go:334] "Generic (PLEG): container finished" podID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerID="049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb" exitCode=0
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222705 4749 generic.go:334] "Generic (PLEG): container finished" podID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerID="5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2" exitCode=143
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerDied","Data":"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerDied","Data":"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f55b697-cee2-42dd-bcb8-898283e0da54","Type":"ContainerDied","Data":"ec264bd3024ad4cdbc85b0f522e975c41f27a4b08808ccdd5bb9d787a9ae6027"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222805 4749 scope.go:117] "RemoveContainer" containerID="049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.222954 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226806 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerID="a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6" exitCode=0
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226826 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerID="790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb" exitCode=143
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerDied","Data":"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerDied","Data":"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a0db52c-f857-49b8-81f8-adacdf9791d7","Type":"ContainerDied","Data":"e049b62c2faa7e5db8e19da786b1afdd2ddc7ac25ae0809416673d265fbdd30a"}
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.226924 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.245506 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.245490202 podStartE2EDuration="2.245490202s" podCreationTimestamp="2025-11-29 02:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:48.234862464 +0000 UTC m=+5751.407012331" watchObservedRunningTime="2025-11-29 02:46:48.245490202 +0000 UTC m=+5751.417640059"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs\") pod \"5a0db52c-f857-49b8-81f8-adacdf9791d7\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle\") pod \"5a0db52c-f857-49b8-81f8-adacdf9791d7\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46vt\" (UniqueName: \"kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt\") pod \"7f55b697-cee2-42dd-bcb8-898283e0da54\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle\") pod \"7f55b697-cee2-42dd-bcb8-898283e0da54\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252711 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data\") pod \"7f55b697-cee2-42dd-bcb8-898283e0da54\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252729 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpppq\" (UniqueName: \"kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq\") pod \"5a0db52c-f857-49b8-81f8-adacdf9791d7\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs\") pod \"7f55b697-cee2-42dd-bcb8-898283e0da54\" (UID: \"7f55b697-cee2-42dd-bcb8-898283e0da54\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.252786 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data\") pod \"5a0db52c-f857-49b8-81f8-adacdf9791d7\" (UID: \"5a0db52c-f857-49b8-81f8-adacdf9791d7\") "
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.253259 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs" (OuterVolumeSpecName: "logs") pod "5a0db52c-f857-49b8-81f8-adacdf9791d7" (UID: "5a0db52c-f857-49b8-81f8-adacdf9791d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.253623 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs" (OuterVolumeSpecName: "logs") pod "7f55b697-cee2-42dd-bcb8-898283e0da54" (UID: "7f55b697-cee2-42dd-bcb8-898283e0da54"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.258829 4749 scope.go:117] "RemoveContainer" containerID="5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.258869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq" (OuterVolumeSpecName: "kube-api-access-wpppq") pod "5a0db52c-f857-49b8-81f8-adacdf9791d7" (UID: "5a0db52c-f857-49b8-81f8-adacdf9791d7"). InnerVolumeSpecName "kube-api-access-wpppq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.270528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt" (OuterVolumeSpecName: "kube-api-access-z46vt") pod "7f55b697-cee2-42dd-bcb8-898283e0da54" (UID: "7f55b697-cee2-42dd-bcb8-898283e0da54"). InnerVolumeSpecName "kube-api-access-z46vt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.284017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a0db52c-f857-49b8-81f8-adacdf9791d7" (UID: "5a0db52c-f857-49b8-81f8-adacdf9791d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.284484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data" (OuterVolumeSpecName: "config-data") pod "7f55b697-cee2-42dd-bcb8-898283e0da54" (UID: "7f55b697-cee2-42dd-bcb8-898283e0da54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.285370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data" (OuterVolumeSpecName: "config-data") pod "5a0db52c-f857-49b8-81f8-adacdf9791d7" (UID: "5a0db52c-f857-49b8-81f8-adacdf9791d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.291394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f55b697-cee2-42dd-bcb8-898283e0da54" (UID: "7f55b697-cee2-42dd-bcb8-898283e0da54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.333696 4749 scope.go:117] "RemoveContainer" containerID="049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.334224 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb\": container with ID starting with 049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb not found: ID does not exist" containerID="049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.334276 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"} err="failed to get container status \"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb\": rpc error: code = NotFound desc = could not find container \"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb\": container with ID starting with 049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.334307 4749 scope.go:117] "RemoveContainer" containerID="5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.334798 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2\": container with ID starting with 5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2 not found: ID does not exist" containerID="5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.334825 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"} err="failed to get container status \"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2\": rpc error: code = NotFound desc = could not find container \"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2\": container with ID starting with 5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2 not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.334848 4749 scope.go:117] "RemoveContainer" containerID="049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.335116 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb"} err="failed to get container status \"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb\": rpc error: code = NotFound desc = could not find container \"049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb\": container with ID starting with 049f13f548d866a9de9454adc50382a1969b4ace4d4eacb77ccd02edb52757eb not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.335164 4749 scope.go:117] "RemoveContainer" containerID="5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.335473 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2"} err="failed to get container status \"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2\": rpc error: code = NotFound desc = could not find container \"5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2\": container with ID starting with 5321c98a484fad24b2852728464ce517b516bf6f335485f2def397f9f4e48af2 not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.335497 4749 scope.go:117] "RemoveContainer" containerID="a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.354427 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355063 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpppq\" (UniqueName: \"kubernetes.io/projected/5a0db52c-f857-49b8-81f8-adacdf9791d7-kube-api-access-wpppq\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355092 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f55b697-cee2-42dd-bcb8-898283e0da54-logs\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355105 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355118 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a0db52c-f857-49b8-81f8-adacdf9791d7-logs\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355130 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0db52c-f857-49b8-81f8-adacdf9791d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355143 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46vt\" (UniqueName: \"kubernetes.io/projected/7f55b697-cee2-42dd-bcb8-898283e0da54-kube-api-access-z46vt\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.355189 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f55b697-cee2-42dd-bcb8-898283e0da54-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.359009 4749 scope.go:117] "RemoveContainer" containerID="790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.386688 4749 scope.go:117] "RemoveContainer" containerID="a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.387041 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6\": container with ID starting with a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6 not found: ID does not exist" containerID="a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387070 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"} err="failed to get container status \"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6\": rpc error: code = NotFound desc = could not find container \"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6\": container with ID starting with a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6 not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387092 4749 scope.go:117] "RemoveContainer" containerID="790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.387465 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb\": container with ID starting with 790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb not found: ID does not exist" containerID="790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387487 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"} err="failed to get container status \"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb\": rpc error: code = NotFound desc = could not find container \"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb\": container with ID starting with 790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387500 4749 scope.go:117] "RemoveContainer" containerID="a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387805 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6"} err="failed to get container status \"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6\": rpc error: code = NotFound desc = could not find container \"a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6\": container with ID starting with a4f7424813a76fd4fae56dc3147143e67a4fa5707d7f70f7399b3fec0cfd88e6 not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.387828 4749 scope.go:117] "RemoveContainer" containerID="790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.388058 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb"} err="failed to get container status \"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb\": rpc error: code = NotFound desc = could not find container \"790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb\": container with ID starting with 790701feb0a420ea11cf743b2cacb26387fed5bd2680c003d58fc9ce4d180fbb not found: ID does not exist"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.583466 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.599663 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.611915 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.624092 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.624892 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" containerName="nova-manage"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.625034 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" containerName="nova-manage"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.625137 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-api"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.625248 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-api"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.625350 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.625439 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.625555 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.625877 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: E1129 02:46:48.626005 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-metadata"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626101 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-metadata"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626478 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-api"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626571 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" containerName="nova-manage"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626646 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" containerName="nova-api-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626722 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-metadata"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.626802 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" containerName="nova-metadata-log"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.628176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.632857 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.636620 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.649075 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.659791 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.661958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.668284 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.680157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.767630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.767728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.767990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.768148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.768275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj2mx\" (UniqueName: \"kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.768323 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89pgq\" (UniqueName: \"kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.768547 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.768778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.870868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.870980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj2mx\" (UniqueName: \"kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871146 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89pgq\" (UniqueName: \"kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.871996 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.875724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.877280 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.878639 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.888186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.893788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89pgq\" (UniqueName: \"kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq\") pod \"nova-api-0\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " pod="openstack/nova-api-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.914021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj2mx\" (UniqueName: \"kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx\") pod \"nova-metadata-0\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " pod="openstack/nova-metadata-0"
Nov 29 02:46:48 crc kubenswrapper[4749]: I1129 02:46:48.993503 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.005606 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.103109 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0db52c-f857-49b8-81f8-adacdf9791d7" path="/var/lib/kubelet/pods/5a0db52c-f857-49b8-81f8-adacdf9791d7/volumes"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.103830 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f55b697-cee2-42dd-bcb8-898283e0da54" path="/var/lib/kubelet/pods/7f55b697-cee2-42dd-bcb8-898283e0da54/volumes"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.522082 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:46:49 crc kubenswrapper[4749]: W1129 02:46:49.526931 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb6e2bf_0723_419d_a4f4_e15cc03ec7a4.slice/crio-2669a641e4b7e0ec2c2664d692669d59f1014fc13b4df32d4281c90b99b4615c WatchSource:0}: Error finding container 2669a641e4b7e0ec2c2664d692669d59f1014fc13b4df32d4281c90b99b4615c: Status 404 returned error can't find the container with id 2669a641e4b7e0ec2c2664d692669d59f1014fc13b4df32d4281c90b99b4615c
Nov 29 02:46:49 crc kubenswrapper[4749]: W1129 02:46:49.595863 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a3e6ed9_2130_43b1_9c92_2939c5f79679.slice/crio-52046438b44981ba9274c3a21538487a28ef027576833840cf565218c3f544d8 WatchSource:0}: Error finding container 52046438b44981ba9274c3a21538487a28ef027576833840cf565218c3f544d8: Status 404 returned error can't find the container with id 52046438b44981ba9274c3a21538487a28ef027576833840cf565218c3f544d8
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.595992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.824363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.835410 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.836623 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.970722 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"]
Nov 29 02:46:49 crc kubenswrapper[4749]: I1129 02:46:49.970987 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94478fffc-flf7q" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="dnsmasq-dns" containerID="cri-o://cf45b96353708baff62b8651c040b67d94020a81c9fd3bfd5af439baa31194b8" gracePeriod=10
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.273901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerStarted","Data":"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.274165 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerStarted","Data":"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.274178 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerStarted","Data":"52046438b44981ba9274c3a21538487a28ef027576833840cf565218c3f544d8"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.277279 4749 generic.go:334] "Generic (PLEG): container finished" podID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerID="cf45b96353708baff62b8651c040b67d94020a81c9fd3bfd5af439baa31194b8" exitCode=0
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.277365 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94478fffc-flf7q" event={"ID":"c3537239-98b7-4e2d-a800-36f8cdac49fe","Type":"ContainerDied","Data":"cf45b96353708baff62b8651c040b67d94020a81c9fd3bfd5af439baa31194b8"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.280319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerStarted","Data":"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.280445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerStarted","Data":"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.280525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerStarted","Data":"2669a641e4b7e0ec2c2664d692669d59f1014fc13b4df32d4281c90b99b4615c"}
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.292960 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.292945325 podStartE2EDuration="2.292945325s" podCreationTimestamp="2025-11-29 02:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:50.289578454 +0000 UTC m=+5753.461728321" watchObservedRunningTime="2025-11-29 02:46:50.292945325 +0000 UTC m=+5753.465095182"
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.302171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.325404 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.325385884 podStartE2EDuration="2.325385884s" podCreationTimestamp="2025-11-29 02:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:50.318559648 +0000 UTC m=+5753.490709505" watchObservedRunningTime="2025-11-29 02:46:50.325385884 +0000 UTC m=+5753.497535741"
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.436896 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94478fffc-flf7q"
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.628556 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb\") pod \"c3537239-98b7-4e2d-a800-36f8cdac49fe\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") "
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.628661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc\") pod \"c3537239-98b7-4e2d-a800-36f8cdac49fe\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") "
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.628738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znj6j\" (UniqueName: \"kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j\") pod \"c3537239-98b7-4e2d-a800-36f8cdac49fe\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") "
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.628780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config\") pod \"c3537239-98b7-4e2d-a800-36f8cdac49fe\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") "
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.629044 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb\") pod \"c3537239-98b7-4e2d-a800-36f8cdac49fe\" (UID: \"c3537239-98b7-4e2d-a800-36f8cdac49fe\") "
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.637440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j" (OuterVolumeSpecName: "kube-api-access-znj6j") pod "c3537239-98b7-4e2d-a800-36f8cdac49fe" (UID: "c3537239-98b7-4e2d-a800-36f8cdac49fe"). InnerVolumeSpecName "kube-api-access-znj6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.669353 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3537239-98b7-4e2d-a800-36f8cdac49fe" (UID: "c3537239-98b7-4e2d-a800-36f8cdac49fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.682172 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config" (OuterVolumeSpecName: "config") pod "c3537239-98b7-4e2d-a800-36f8cdac49fe" (UID: "c3537239-98b7-4e2d-a800-36f8cdac49fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.695983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3537239-98b7-4e2d-a800-36f8cdac49fe" (UID: "c3537239-98b7-4e2d-a800-36f8cdac49fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.697287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3537239-98b7-4e2d-a800-36f8cdac49fe" (UID: "c3537239-98b7-4e2d-a800-36f8cdac49fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.731644 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.731691 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.731711 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.731728 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znj6j\" (UniqueName: \"kubernetes.io/projected/c3537239-98b7-4e2d-a800-36f8cdac49fe-kube-api-access-znj6j\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:50 crc kubenswrapper[4749]: I1129 02:46:50.731743 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3537239-98b7-4e2d-a800-36f8cdac49fe-config\") on node \"crc\" DevicePath \"\""
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.292330 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94478fffc-flf7q"
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.292596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94478fffc-flf7q" event={"ID":"c3537239-98b7-4e2d-a800-36f8cdac49fe","Type":"ContainerDied","Data":"1604568eed9be31a843ec034fc4e79fc88f33655205a7713481452cac05e07ce"}
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.292696 4749 scope.go:117] "RemoveContainer" containerID="cf45b96353708baff62b8651c040b67d94020a81c9fd3bfd5af439baa31194b8"
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.319356 4749 scope.go:117] "RemoveContainer" containerID="58d53ddbb1814f6b1a145376d1a9e209cdff8b3693a99395ea76ce44744125e3"
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.321552 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"]
Nov 29 02:46:51 crc kubenswrapper[4749]: I1129 02:46:51.332688 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94478fffc-flf7q"]
Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.121037 4749 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.270976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data\") pod \"cb7391ad-897b-4359-88df-ace9ae2f0517\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.271471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle\") pod \"cb7391ad-897b-4359-88df-ace9ae2f0517\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.271670 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phmk5\" (UniqueName: \"kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5\") pod \"cb7391ad-897b-4359-88df-ace9ae2f0517\" (UID: \"cb7391ad-897b-4359-88df-ace9ae2f0517\") " Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.279060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5" (OuterVolumeSpecName: "kube-api-access-phmk5") pod "cb7391ad-897b-4359-88df-ace9ae2f0517" (UID: "cb7391ad-897b-4359-88df-ace9ae2f0517"). InnerVolumeSpecName "kube-api-access-phmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.304243 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb7391ad-897b-4359-88df-ace9ae2f0517" (UID: "cb7391ad-897b-4359-88df-ace9ae2f0517"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data" (OuterVolumeSpecName: "config-data") pod "cb7391ad-897b-4359-88df-ace9ae2f0517" (UID: "cb7391ad-897b-4359-88df-ace9ae2f0517"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310748 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb7391ad-897b-4359-88df-ace9ae2f0517" containerID="37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f" exitCode=0 Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb7391ad-897b-4359-88df-ace9ae2f0517","Type":"ContainerDied","Data":"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f"} Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310848 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb7391ad-897b-4359-88df-ace9ae2f0517","Type":"ContainerDied","Data":"1ca8e828303828ad5de75abec36dc1683793ce4cdbbc69bf060d112715837be6"} Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310877 4749 scope.go:117] "RemoveContainer" containerID="37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.310983 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.373642 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.373675 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7391ad-897b-4359-88df-ace9ae2f0517-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.373686 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phmk5\" (UniqueName: \"kubernetes.io/projected/cb7391ad-897b-4359-88df-ace9ae2f0517-kube-api-access-phmk5\") on node \"crc\" DevicePath \"\"" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.395539 4749 scope.go:117] "RemoveContainer" containerID="37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f" Nov 29 02:46:52 crc kubenswrapper[4749]: E1129 02:46:52.399737 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f\": container with ID starting with 37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f not found: ID does not exist" containerID="37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.399793 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f"} err="failed to get container status \"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f\": rpc error: code = NotFound desc = could not find container \"37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f\": container with ID starting with 37df99ea98e821fb962f57eb5b2f2f79cbeb8758b5a8ca9e9b9196cc1c913e8f not found: ID does not exist" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.402635 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.419541 4749 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.435293 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:52 crc kubenswrapper[4749]: E1129 02:46:52.435737 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7391ad-897b-4359-88df-ace9ae2f0517" containerName="nova-scheduler-scheduler" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.435755 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7391ad-897b-4359-88df-ace9ae2f0517" containerName="nova-scheduler-scheduler" Nov 29 02:46:52 crc kubenswrapper[4749]: E1129 02:46:52.435777 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="init" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.435785 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="init" Nov 29 02:46:52 crc kubenswrapper[4749]: E1129 02:46:52.435809 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="dnsmasq-dns" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.435814 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="dnsmasq-dns" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.435988 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" containerName="dnsmasq-dns" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.436002 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7391ad-897b-4359-88df-ace9ae2f0517" containerName="nova-scheduler-scheduler" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.436650 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.439111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.446700 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.475310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.475633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.475787 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmrbv\" (UniqueName: \"kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.578108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.578178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.578235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmrbv\" (UniqueName: \"kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.582717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.583630 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.593985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmrbv\" (UniqueName: 
\"kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv\") pod \"nova-scheduler-0\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " pod="openstack/nova-scheduler-0" Nov 29 02:46:52 crc kubenswrapper[4749]: I1129 02:46:52.752741 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.088431 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3537239-98b7-4e2d-a800-36f8cdac49fe" path="/var/lib/kubelet/pods/c3537239-98b7-4e2d-a800-36f8cdac49fe/volumes" Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.092231 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7391ad-897b-4359-88df-ace9ae2f0517" path="/var/lib/kubelet/pods/cb7391ad-897b-4359-88df-ace9ae2f0517/volumes" Nov 29 02:46:53 crc kubenswrapper[4749]: W1129 02:46:53.251531 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2cca797_3b4a_4fe1_a7ea_bb2fe54d1481.slice/crio-c0bc157434bed3a39a52801a40cc592c348b7e46e0b02f07fdf4a98429f90bd2 WatchSource:0}: Error finding container c0bc157434bed3a39a52801a40cc592c348b7e46e0b02f07fdf4a98429f90bd2: Status 404 returned error can't find the container with id c0bc157434bed3a39a52801a40cc592c348b7e46e0b02f07fdf4a98429f90bd2 Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.251635 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.326816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481","Type":"ContainerStarted","Data":"c0bc157434bed3a39a52801a40cc592c348b7e46e0b02f07fdf4a98429f90bd2"} Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.994430 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:46:53 crc kubenswrapper[4749]: I1129 02:46:53.994851 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:46:54 crc kubenswrapper[4749]: I1129 02:46:54.344183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481","Type":"ContainerStarted","Data":"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9"} Nov 29 02:46:54 crc kubenswrapper[4749]: I1129 02:46:54.379030 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.379008208 podStartE2EDuration="2.379008208s" podCreationTimestamp="2025-11-29 02:46:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:54.370470051 +0000 UTC m=+5757.542619948" watchObservedRunningTime="2025-11-29 02:46:54.379008208 +0000 UTC m=+5757.551158075" Nov 29 02:46:56 crc kubenswrapper[4749]: I1129 02:46:56.654050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.173466 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4872k"] Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.175631 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.178963 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.179693 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.239187 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4872k"] Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.319276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.319422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk4mw\" (UniqueName: \"kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.319559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.319643 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.421549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk4mw\" (UniqueName: \"kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.421669 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.421758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.421820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.430080 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.430766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.442121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.455084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk4mw\" (UniqueName: \"kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw\") pod \"nova-cell1-cell-mapping-4872k\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.546683 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:46:57 crc kubenswrapper[4749]: I1129 02:46:57.754005 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.073804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4872k"] Nov 29 02:46:58 crc kubenswrapper[4749]: W1129 02:46:58.077175 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f1aab52_f574_4c9e_aef6_735905de460f.slice/crio-a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512 WatchSource:0}: Error finding container a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512: Status 404 returned error can't find the container with id a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512 Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.389529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4872k" event={"ID":"3f1aab52-f574-4c9e-aef6-735905de460f","Type":"ContainerStarted","Data":"c5b19be98784276127790013d8bf9be97e8dc033c7284b03f529e958d5803923"} Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.389594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4872k" event={"ID":"3f1aab52-f574-4c9e-aef6-735905de460f","Type":"ContainerStarted","Data":"a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512"} Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.411091 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4872k" podStartSLOduration=1.411066498 podStartE2EDuration="1.411066498s" podCreationTimestamp="2025-11-29 02:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:46:58.403668769 +0000 UTC m=+5761.575818626" watchObservedRunningTime="2025-11-29 02:46:58.411066498 +0000 UTC m=+5761.583216385" Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.994317 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:46:58 crc kubenswrapper[4749]: I1129 02:46:58.995936 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:46:59 crc kubenswrapper[4749]: I1129 02:46:59.006035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:46:59 crc kubenswrapper[4749]: I1129 02:46:59.006087 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.076054 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:00 crc kubenswrapper[4749]: E1129 02:47:00.077392 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.158591 
Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.158591 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.158617 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.158658 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.67:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 29 02:47:00 crc kubenswrapper[4749]: I1129 02:47:00.158669 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
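Those four failures are HTTP startup probes timing out against the pods' endpoints (GET / on 8774 for nova-api, 8775 for nova-metadata). A sketch of a probe definition that would produce the nova-api entries, built with k8s.io/api; the path and port come from the probe output above, while every timing value is assumed:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        startup := &corev1.Probe{
            ProbeHandler: corev1.ProbeHandler{
                HTTPGet: &corev1.HTTPGetAction{
                    Path: "/",                  // from the probe output above
                    Port: intstr.FromInt(8774), // nova-api; 8775 for nova-metadata
                },
            },
            // Assumed values: a short timeout is what turns a slow startup
            // into "context deadline exceeded (Client.Timeout exceeded)".
            TimeoutSeconds:   3,
            PeriodSeconds:    5,
            FailureThreshold: 60,
        }
        fmt.Printf("%+v\n", startup)
    }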
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.885222 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk4mw\" (UniqueName: \"kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw\") pod \"3f1aab52-f574-4c9e-aef6-735905de460f\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.885287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts\") pod \"3f1aab52-f574-4c9e-aef6-735905de460f\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.885394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data\") pod \"3f1aab52-f574-4c9e-aef6-735905de460f\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.885569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle\") pod \"3f1aab52-f574-4c9e-aef6-735905de460f\" (UID: \"3f1aab52-f574-4c9e-aef6-735905de460f\") " Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.891952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts" (OuterVolumeSpecName: "scripts") pod "3f1aab52-f574-4c9e-aef6-735905de460f" (UID: "3f1aab52-f574-4c9e-aef6-735905de460f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.893347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw" (OuterVolumeSpecName: "kube-api-access-bk4mw") pod "3f1aab52-f574-4c9e-aef6-735905de460f" (UID: "3f1aab52-f574-4c9e-aef6-735905de460f"). InnerVolumeSpecName "kube-api-access-bk4mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.930384 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data" (OuterVolumeSpecName: "config-data") pod "3f1aab52-f574-4c9e-aef6-735905de460f" (UID: "3f1aab52-f574-4c9e-aef6-735905de460f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.937422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f1aab52-f574-4c9e-aef6-735905de460f" (UID: "3f1aab52-f574-4c9e-aef6-735905de460f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.990972 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk4mw\" (UniqueName: \"kubernetes.io/projected/3f1aab52-f574-4c9e-aef6-735905de460f-kube-api-access-bk4mw\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.991018 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.991082 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:04 crc kubenswrapper[4749]: I1129 02:47:04.991099 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f1aab52-f574-4c9e-aef6-735905de460f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.472790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4872k" event={"ID":"3f1aab52-f574-4c9e-aef6-735905de460f","Type":"ContainerDied","Data":"a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512"} Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.472844 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58ce707883c939efac69c3893e645b1d8e64a50d751f4f4e9c99a26f9cf5512" Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.473356 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4872k" Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.665036 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.665600 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-log" containerID="cri-o://a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0" gracePeriod=30 Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.665697 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-api" containerID="cri-o://faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562" gracePeriod=30 Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.735609 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.735831 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerName="nova-scheduler-scheduler" containerID="cri-o://2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" gracePeriod=30 Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.773187 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.774657 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" 
containerName="nova-metadata-log" containerID="cri-o://8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a" gracePeriod=30 Nov 29 02:47:05 crc kubenswrapper[4749]: I1129 02:47:05.775419 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-metadata" containerID="cri-o://b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76" gracePeriod=30 Nov 29 02:47:06 crc kubenswrapper[4749]: I1129 02:47:06.483567 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerID="8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a" exitCode=143 Nov 29 02:47:06 crc kubenswrapper[4749]: I1129 02:47:06.483676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerDied","Data":"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a"} Nov 29 02:47:06 crc kubenswrapper[4749]: I1129 02:47:06.486508 4749 generic.go:334] "Generic (PLEG): container finished" podID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerID="a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0" exitCode=143 Nov 29 02:47:06 crc kubenswrapper[4749]: I1129 02:47:06.486555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerDied","Data":"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0"} Nov 29 02:47:07 crc kubenswrapper[4749]: E1129 02:47:07.756056 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 02:47:07 crc kubenswrapper[4749]: E1129 02:47:07.759609 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 02:47:07 crc kubenswrapper[4749]: E1129 02:47:07.762074 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 02:47:07 crc kubenswrapper[4749]: E1129 02:47:07.762140 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerName="nova-scheduler-scheduler" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.306882 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.358998 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.376870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data\") pod \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.376973 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs\") pod \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.377026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle\") pod \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.377130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89pgq\" (UniqueName: \"kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq\") pod \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\" (UID: \"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.378073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs" (OuterVolumeSpecName: "logs") pod "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" (UID: "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.383386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq" (OuterVolumeSpecName: "kube-api-access-89pgq") pod "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" (UID: "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4"). InnerVolumeSpecName "kube-api-access-89pgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.434357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data" (OuterVolumeSpecName: "config-data") pod "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" (UID: "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.434445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" (UID: "afb6e2bf-0723-419d-a4f4-e15cc03ec7a4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.479778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs\") pod \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.479834 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle\") pod \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.479962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj2mx\" (UniqueName: \"kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx\") pod \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.479989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data\") pod \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\" (UID: \"9a3e6ed9-2130-43b1-9c92-2939c5f79679\") " Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.480322 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89pgq\" (UniqueName: \"kubernetes.io/projected/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-kube-api-access-89pgq\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.480337 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.480345 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.480356 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.481652 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs" (OuterVolumeSpecName: "logs") pod "9a3e6ed9-2130-43b1-9c92-2939c5f79679" (UID: "9a3e6ed9-2130-43b1-9c92-2939c5f79679"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.507403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx" (OuterVolumeSpecName: "kube-api-access-cj2mx") pod "9a3e6ed9-2130-43b1-9c92-2939c5f79679" (UID: "9a3e6ed9-2130-43b1-9c92-2939c5f79679"). InnerVolumeSpecName "kube-api-access-cj2mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.531375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data" (OuterVolumeSpecName: "config-data") pod "9a3e6ed9-2130-43b1-9c92-2939c5f79679" (UID: "9a3e6ed9-2130-43b1-9c92-2939c5f79679"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.557367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a3e6ed9-2130-43b1-9c92-2939c5f79679" (UID: "9a3e6ed9-2130-43b1-9c92-2939c5f79679"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.583219 4749 generic.go:334] "Generic (PLEG): container finished" podID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerID="faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562" exitCode=0 Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.583274 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.583267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerDied","Data":"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562"} Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.583325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afb6e2bf-0723-419d-a4f4-e15cc03ec7a4","Type":"ContainerDied","Data":"2669a641e4b7e0ec2c2664d692669d59f1014fc13b4df32d4281c90b99b4615c"} Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.583345 4749 scope.go:117] "RemoveContainer" containerID="faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.584005 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj2mx\" (UniqueName: \"kubernetes.io/projected/9a3e6ed9-2130-43b1-9c92-2939c5f79679-kube-api-access-cj2mx\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.584036 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.584047 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6ed9-2130-43b1-9c92-2939c5f79679-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.584056 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a3e6ed9-2130-43b1-9c92-2939c5f79679-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.588862 4749 generic.go:334] "Generic (PLEG): container finished" podID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerID="b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76" exitCode=0 Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.588899 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerDied","Data":"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76"} Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.588919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a3e6ed9-2130-43b1-9c92-2939c5f79679","Type":"ContainerDied","Data":"52046438b44981ba9274c3a21538487a28ef027576833840cf565218c3f544d8"} Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.588971 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.611395 4749 scope.go:117] "RemoveContainer" containerID="a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.625249 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.630976 4749 scope.go:117] "RemoveContainer" containerID="faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.634521 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562\": container with ID starting with faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562 not found: ID does not exist" containerID="faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.634607 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562"} err="failed to get container status \"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562\": rpc error: code = NotFound desc = could not find container \"faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562\": container with ID starting with faec67a1c237d7918bc8f2fe9ae1be5ab3fefcc021e42556fa3cbde6be93e562 not found: ID does not exist" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.634634 4749 scope.go:117] "RemoveContainer" containerID="a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.638478 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0\": container with ID starting with a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0 not found: ID does not exist" containerID="a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.638524 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0"} err="failed to get container status \"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0\": rpc error: code = NotFound desc = could not find container \"a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0\": container with ID starting with a62879e8e7f0f4d63255276397f918e2033732578aa752eaa40939f9909dc9f0 not found: ID does not exist" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.638551 4749 scope.go:117] 
"RemoveContainer" containerID="b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.656169 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.679665 4749 scope.go:117] "RemoveContainer" containerID="8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.681740 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.715889 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.715913 4749 scope.go:117] "RemoveContainer" containerID="b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.718381 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76\": container with ID starting with b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76 not found: ID does not exist" containerID="b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.718424 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76"} err="failed to get container status \"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76\": rpc error: code = NotFound desc = could not find container \"b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76\": container with ID starting with b050ea27a5fa5088871cdf32abb6caace9ccf3efab4a06ffac7c601393fe2a76 not found: ID does not exist" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.718452 4749 scope.go:117] "RemoveContainer" containerID="8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.719525 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a\": container with ID starting with 8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a not found: ID does not exist" containerID="8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.719561 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a"} err="failed to get container status \"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a\": rpc error: code = NotFound desc = could not find container \"8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a\": container with ID starting with 8006e65d1acb7f8a324877b9c95e9b626167e84d3c77a5d48f2a51aa81e14a5a not found: ID does not exist" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.727788 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.728146 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-log" Nov 29 
02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-log" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.728192 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-metadata" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728213 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-metadata" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.728222 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-api" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728228 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-api" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.728238 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-log" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728243 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-log" Nov 29 02:47:09 crc kubenswrapper[4749]: E1129 02:47:09.728252 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f1aab52-f574-4c9e-aef6-735905de460f" containerName="nova-manage" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728258 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f1aab52-f574-4c9e-aef6-735905de460f" containerName="nova-manage" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728420 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-log" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728439 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f1aab52-f574-4c9e-aef6-735905de460f" containerName="nova-manage" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728451 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" containerName="nova-api-api" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728463 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-log" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.728476 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" containerName="nova-metadata-metadata" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.729383 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.731694 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.734293 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.735450 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.738155 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.741815 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.755699 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.789688 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.789852 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.789873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkhd\" (UniqueName: \"kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.789926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfrxf\" (UniqueName: \"kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc 
kubenswrapper[4749]: I1129 02:47:09.891563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkhd\" (UniqueName: \"kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.891694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.893494 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.895722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.895727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.912272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkhd\" (UniqueName: \"kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd\") pod \"nova-metadata-0\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " pod="openstack/nova-metadata-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.993658 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.993709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfrxf\" (UniqueName: \"kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 
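
Each new pod's volumes march through the same three log lines seen above: VerifyControllerAttachedVolume from reconciler_common.go, then MountVolume started, then MountVolume.SetUp succeeded from operation_generator.go. A rough Go sketch of that ordering, using hypothetical types (volState, advance) that only illustrate the progression, not the kubelet's actual desired/actual state-of-world machinery, in which these steps run asynchronously per volume:

    package main

    import "fmt"

    type volState int

    const (
    	attachedVerified volState = iota
    	mountStarted
    	mounted
    )

    // advance moves one volume a single step along the sequence the log
    // entries above record: verify -> mount started -> SetUp succeeded.
    func advance(name string, states map[string]volState) {
    	switch states[name] {
    	case attachedVerified:
    		fmt.Printf("operationExecutor.MountVolume started for volume %q\n", name)
    		states[name] = mountStarted
    	case mountStarted:
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", name)
    		states[name] = mounted
    	}
    }

    func main() {
    	// Volume names taken from the nova-metadata-0 entries above.
    	vols := []string{"combined-ca-bundle", "logs", "kube-api-access-4tkhd", "config-data"}
    	states := map[string]volState{}
    	for _, v := range vols {
    		fmt.Printf("operationExecutor.VerifyControllerAttachedVolume started for volume %q\n", v)
    		states[v] = attachedVerified
    	}
    	for _, v := range vols {
    		advance(v, states) // MountVolume started
    		advance(v, states) // MountVolume.SetUp succeeded
    	}
    }
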
29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.993731 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.993823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.995914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.998325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:09 crc kubenswrapper[4749]: I1129 02:47:09.998848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.013599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfrxf\" (UniqueName: \"kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf\") pod \"nova-api-0\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " pod="openstack/nova-api-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.089020 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.121025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.133779 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.197283 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle\") pod \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.197352 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data\") pod \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.197467 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmrbv\" (UniqueName: \"kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv\") pod \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\" (UID: \"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481\") " Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.201128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv" (OuterVolumeSpecName: "kube-api-access-dmrbv") pod "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" (UID: "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481"). InnerVolumeSpecName "kube-api-access-dmrbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.226489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" (UID: "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.257276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data" (OuterVolumeSpecName: "config-data") pod "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" (UID: "b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.300170 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.300242 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmrbv\" (UniqueName: \"kubernetes.io/projected/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-kube-api-access-dmrbv\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.300263 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.617022 4749 generic.go:334] "Generic (PLEG): container finished" podID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" exitCode=0 Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.617069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481","Type":"ContainerDied","Data":"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9"} Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.617117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481","Type":"ContainerDied","Data":"c0bc157434bed3a39a52801a40cc592c348b7e46e0b02f07fdf4a98429f90bd2"} Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.617118 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.617134 4749 scope.go:117] "RemoveContainer" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.634255 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.650387 4749 scope.go:117] "RemoveContainer" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" Nov 29 02:47:10 crc kubenswrapper[4749]: E1129 02:47:10.650888 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9\": container with ID starting with 2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9 not found: ID does not exist" containerID="2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.650938 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9"} err="failed to get container status \"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9\": rpc error: code = NotFound desc = could not find container \"2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9\": container with ID starting with 2c8dc3d2b93d60fd698eb5bba966fbd8d772f2bb2d12c8fa3067dc46f389a7f9 not found: ID does not exist" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.661376 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.672575 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.685288 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: E1129 02:47:10.685713 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerName="nova-scheduler-scheduler" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.685737 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerName="nova-scheduler-scheduler" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.685954 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" containerName="nova-scheduler-scheduler" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.686635 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.689985 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.697143 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.723671 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:47:10 crc kubenswrapper[4749]: W1129 02:47:10.725650 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39110a88_02a9_4031_a747_6c83d4e69aa4.slice/crio-a9e4a9e9471c925a6916a9b55cf799ac4d6bb6933738a41b0c7600c9ac5b7e9c WatchSource:0}: Error finding container a9e4a9e9471c925a6916a9b55cf799ac4d6bb6933738a41b0c7600c9ac5b7e9c: Status 404 returned error can't find the container with id a9e4a9e9471c925a6916a9b55cf799ac4d6bb6933738a41b0c7600c9ac5b7e9c Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.814611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.815040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.815258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6kj\" (UniqueName: \"kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.916682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.916736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.916831 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6kj\" (UniqueName: \"kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.920866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.924522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:10 crc kubenswrapper[4749]: I1129 02:47:10.932290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6kj\" (UniqueName: \"kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj\") pod \"nova-scheduler-0\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " pod="openstack/nova-scheduler-0" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.005265 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.095376 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3e6ed9-2130-43b1-9c92-2939c5f79679" path="/var/lib/kubelet/pods/9a3e6ed9-2130-43b1-9c92-2939c5f79679/volumes" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.096255 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb6e2bf-0723-419d-a4f4-e15cc03ec7a4" path="/var/lib/kubelet/pods/afb6e2bf-0723-419d-a4f4-e15cc03ec7a4/volumes" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.096948 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481" path="/var/lib/kubelet/pods/b2cca797-3b4a-4fe1-a7ea-bb2fe54d1481/volumes" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.476890 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:47:11 crc kubenswrapper[4749]: W1129 02:47:11.481473 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod820581a7_a881_4272_b198_b25ad45892dd.slice/crio-0a090605667210cf9cb84557714a2947352097440b59210e6dce5f812da61f5f WatchSource:0}: Error finding container 0a090605667210cf9cb84557714a2947352097440b59210e6dce5f812da61f5f: Status 404 returned error can't find the container with id 0a090605667210cf9cb84557714a2947352097440b59210e6dce5f812da61f5f Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.632985 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerStarted","Data":"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.633358 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerStarted","Data":"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.633382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerStarted","Data":"29ddd33d3f34ed4fd370ef538bcc1293bbfee81e9b490fd8e2be35345b9e1307"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.636173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerStarted","Data":"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.636249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerStarted","Data":"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.636268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerStarted","Data":"a9e4a9e9471c925a6916a9b55cf799ac4d6bb6933738a41b0c7600c9ac5b7e9c"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.640840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"820581a7-a881-4272-b198-b25ad45892dd","Type":"ContainerStarted","Data":"0a090605667210cf9cb84557714a2947352097440b59210e6dce5f812da61f5f"} Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.654141 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.654123753 podStartE2EDuration="2.654123753s" podCreationTimestamp="2025-11-29 02:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:11.652575696 +0000 UTC m=+5774.824725593" watchObservedRunningTime="2025-11-29 02:47:11.654123753 +0000 UTC m=+5774.826273620" Nov 29 02:47:11 crc kubenswrapper[4749]: I1129 02:47:11.680119 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.680102415 podStartE2EDuration="2.680102415s" podCreationTimestamp="2025-11-29 02:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:11.6757984 +0000 UTC m=+5774.847948267" watchObservedRunningTime="2025-11-29 02:47:11.680102415 +0000 UTC m=+5774.852252282" Nov 29 02:47:12 crc kubenswrapper[4749]: I1129 02:47:12.075571 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:12 crc kubenswrapper[4749]: E1129 02:47:12.076119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:12 crc kubenswrapper[4749]: I1129 02:47:12.660924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"820581a7-a881-4272-b198-b25ad45892dd","Type":"ContainerStarted","Data":"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9"} Nov 29 02:47:12 crc kubenswrapper[4749]: I1129 02:47:12.689334 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.689309534 podStartE2EDuration="2.689309534s" podCreationTimestamp="2025-11-29 02:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-11-29 02:47:12.681348961 +0000 UTC m=+5775.853498828" watchObservedRunningTime="2025-11-29 02:47:12.689309534 +0000 UTC m=+5775.861459401" Nov 29 02:47:15 crc kubenswrapper[4749]: I1129 02:47:15.094056 4749 scope.go:117] "RemoveContainer" containerID="2553c087b805afe93c9582071d4686e8977af0232ee379015f8df1a9a6ebe9d6" Nov 29 02:47:15 crc kubenswrapper[4749]: I1129 02:47:15.121559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:47:15 crc kubenswrapper[4749]: I1129 02:47:15.124101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:47:16 crc kubenswrapper[4749]: I1129 02:47:16.005570 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 02:47:20 crc kubenswrapper[4749]: I1129 02:47:20.121850 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:47:20 crc kubenswrapper[4749]: I1129 02:47:20.122554 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:47:20 crc kubenswrapper[4749]: I1129 02:47:20.134566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:47:20 crc kubenswrapper[4749]: I1129 02:47:20.134636 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.005810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.049024 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.204424 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.286406 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.286420 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.287018 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:47:21 crc kubenswrapper[4749]: I1129 02:47:21.849980 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 02:47:23 crc kubenswrapper[4749]: I1129 
02:47:23.075998 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:23 crc kubenswrapper[4749]: E1129 02:47:23.076462 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.130551 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.140469 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.141666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.148367 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.148888 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.151041 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.161493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.898038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.901256 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 02:47:30 crc kubenswrapper[4749]: I1129 02:47:30.902779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.165839 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.167679 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.189088 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.252016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2x87\" (UniqueName: \"kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.252060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.252137 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.252314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.252423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.354688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.354795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2x87\" (UniqueName: \"kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.354830 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.354877 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.354968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.356004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.356801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.357975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.358604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.379186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2x87\" (UniqueName: \"kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87\") pod \"dnsmasq-dns-86b44557d5-4jttm\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.492732 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:31 crc kubenswrapper[4749]: I1129 02:47:31.944831 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:47:32 crc kubenswrapper[4749]: I1129 02:47:32.916764 4749 generic.go:334] "Generic (PLEG): container finished" podID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerID="021d9a17c74428f8f0fab71f44692d1ed82e733f5c3aca0a6bf7d38c7b7c8aea" exitCode=0 Nov 29 02:47:32 crc kubenswrapper[4749]: I1129 02:47:32.918478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" event={"ID":"007bfd49-61bd-4de4-a8fc-f1fefafa5b15","Type":"ContainerDied","Data":"021d9a17c74428f8f0fab71f44692d1ed82e733f5c3aca0a6bf7d38c7b7c8aea"} Nov 29 02:47:32 crc kubenswrapper[4749]: I1129 02:47:32.918517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" event={"ID":"007bfd49-61bd-4de4-a8fc-f1fefafa5b15","Type":"ContainerStarted","Data":"574b18098f1d23338baf5b9fe471a2064cbe27c9aa3590c4fada5f4b12cdebdb"} Nov 29 02:47:33 crc kubenswrapper[4749]: I1129 02:47:33.926955 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" event={"ID":"007bfd49-61bd-4de4-a8fc-f1fefafa5b15","Type":"ContainerStarted","Data":"94ba86033c46619795d94b063459b0bbc363ecdfaf469fc43b289b32e25721aa"} Nov 29 02:47:33 crc kubenswrapper[4749]: I1129 02:47:33.927907 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:33 crc kubenswrapper[4749]: I1129 02:47:33.946590 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" podStartSLOduration=2.9465725149999997 podStartE2EDuration="2.946572515s" podCreationTimestamp="2025-11-29 02:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:33.944635138 +0000 UTC m=+5797.116785045" watchObservedRunningTime="2025-11-29 02:47:33.946572515 +0000 UTC m=+5797.118722372" Nov 29 02:47:35 crc kubenswrapper[4749]: I1129 02:47:35.075697 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:35 crc kubenswrapper[4749]: E1129 02:47:35.076135 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:41 crc kubenswrapper[4749]: I1129 02:47:41.495566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:47:41 crc kubenswrapper[4749]: I1129 02:47:41.572950 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:47:41 crc kubenswrapper[4749]: I1129 02:47:41.573289 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="dnsmasq-dns" containerID="cri-o://dd4ca9ae891caac20eb8b1abc5492777fdf1a7e82d7113e249b87586b02b0e5d" 
gracePeriod=10 Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.013068 4749 generic.go:334] "Generic (PLEG): container finished" podID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerID="dd4ca9ae891caac20eb8b1abc5492777fdf1a7e82d7113e249b87586b02b0e5d" exitCode=0 Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.013265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" event={"ID":"aaa73ce0-0bac-482b-9076-cc55df3efd74","Type":"ContainerDied","Data":"dd4ca9ae891caac20eb8b1abc5492777fdf1a7e82d7113e249b87586b02b0e5d"} Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.130094 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.167629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd42c\" (UniqueName: \"kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c\") pod \"aaa73ce0-0bac-482b-9076-cc55df3efd74\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.167725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb\") pod \"aaa73ce0-0bac-482b-9076-cc55df3efd74\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.167850 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb\") pod \"aaa73ce0-0bac-482b-9076-cc55df3efd74\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.167958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc\") pod \"aaa73ce0-0bac-482b-9076-cc55df3efd74\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.168699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config\") pod \"aaa73ce0-0bac-482b-9076-cc55df3efd74\" (UID: \"aaa73ce0-0bac-482b-9076-cc55df3efd74\") " Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.184900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c" (OuterVolumeSpecName: "kube-api-access-kd42c") pod "aaa73ce0-0bac-482b-9076-cc55df3efd74" (UID: "aaa73ce0-0bac-482b-9076-cc55df3efd74"). InnerVolumeSpecName "kube-api-access-kd42c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.224026 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aaa73ce0-0bac-482b-9076-cc55df3efd74" (UID: "aaa73ce0-0bac-482b-9076-cc55df3efd74"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.224606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aaa73ce0-0bac-482b-9076-cc55df3efd74" (UID: "aaa73ce0-0bac-482b-9076-cc55df3efd74"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.226581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config" (OuterVolumeSpecName: "config") pod "aaa73ce0-0bac-482b-9076-cc55df3efd74" (UID: "aaa73ce0-0bac-482b-9076-cc55df3efd74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.239287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aaa73ce0-0bac-482b-9076-cc55df3efd74" (UID: "aaa73ce0-0bac-482b-9076-cc55df3efd74"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.272692 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.272967 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.273046 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.273152 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd42c\" (UniqueName: \"kubernetes.io/projected/aaa73ce0-0bac-482b-9076-cc55df3efd74-kube-api-access-kd42c\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:42 crc kubenswrapper[4749]: I1129 02:47:42.273331 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaa73ce0-0bac-482b-9076-cc55df3efd74-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.028921 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" event={"ID":"aaa73ce0-0bac-482b-9076-cc55df3efd74","Type":"ContainerDied","Data":"c185342d86f829890b2427ece89394cabca6b45b7fdf5544382636255474b67d"} Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.029000 4749 scope.go:117] "RemoveContainer" containerID="dd4ca9ae891caac20eb8b1abc5492777fdf1a7e82d7113e249b87586b02b0e5d" Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.029182 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb9ccdbc-x84nj" Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.069162 4749 scope.go:117] "RemoveContainer" containerID="2f717d614f2651123b57d772dc39cad53b029c47f96543eb5859af336854dcf1" Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.107998 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:47:43 crc kubenswrapper[4749]: I1129 02:47:43.108059 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb9ccdbc-x84nj"] Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.055152 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-k877z"] Nov 29 02:47:44 crc kubenswrapper[4749]: E1129 02:47:44.056329 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="init" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.056367 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="init" Nov 29 02:47:44 crc kubenswrapper[4749]: E1129 02:47:44.056419 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="dnsmasq-dns" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.056432 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="dnsmasq-dns" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.062492 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" containerName="dnsmasq-dns" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.065518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.077916 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k877z"] Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.109936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjtt\" (UniqueName: \"kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.110030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.156869 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b1ac-account-create-update-rg4th"] Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.158007 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.162863 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.172666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b1ac-account-create-update-rg4th"] Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.211563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts\") pod \"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.211674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnxbq\" (UniqueName: \"kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq\") pod \"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.211699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjtt\" (UniqueName: \"kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.211719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.212333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.234917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjtt\" (UniqueName: \"kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt\") pod \"cinder-db-create-k877z\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.312509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnxbq\" (UniqueName: \"kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq\") pod \"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.312644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts\") pod 
\"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.314302 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts\") pod \"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.341246 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnxbq\" (UniqueName: \"kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq\") pod \"cinder-b1ac-account-create-update-rg4th\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.389687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-k877z" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.474781 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.806398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b1ac-account-create-update-rg4th"] Nov 29 02:47:44 crc kubenswrapper[4749]: I1129 02:47:44.860360 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-k877z"] Nov 29 02:47:44 crc kubenswrapper[4749]: W1129 02:47:44.860979 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e39d797_8cc7_40f1_8929_2ab733b4da0b.slice/crio-9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb WatchSource:0}: Error finding container 9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb: Status 404 returned error can't find the container with id 9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.060077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1ac-account-create-update-rg4th" event={"ID":"e9f351a6-2f7a-48f1-9b19-1c73da851367","Type":"ContainerStarted","Data":"d05157f45661b784be5a49101ed8c0aae65a5da610802f1aaf74b12a0c327ebf"} Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.060158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1ac-account-create-update-rg4th" event={"ID":"e9f351a6-2f7a-48f1-9b19-1c73da851367","Type":"ContainerStarted","Data":"694ee90dd9aef16a0f6e4cb73d513184e2e2db29c618ce6a28a9238b3e1a2851"} Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.065084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k877z" event={"ID":"2e39d797-8cc7-40f1-8929-2ab733b4da0b","Type":"ContainerStarted","Data":"d69572105badca04be2140e2b31841c6d87c36fb1b80f0e79f541dd20d59aa7c"} Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.065134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k877z" event={"ID":"2e39d797-8cc7-40f1-8929-2ab733b4da0b","Type":"ContainerStarted","Data":"9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb"} Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.087472 
4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b1ac-account-create-update-rg4th" podStartSLOduration=1.087453836 podStartE2EDuration="1.087453836s" podCreationTimestamp="2025-11-29 02:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:45.076367787 +0000 UTC m=+5808.248517664" watchObservedRunningTime="2025-11-29 02:47:45.087453836 +0000 UTC m=+5808.259603703" Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.097270 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa73ce0-0bac-482b-9076-cc55df3efd74" path="/var/lib/kubelet/pods/aaa73ce0-0bac-482b-9076-cc55df3efd74/volumes" Nov 29 02:47:45 crc kubenswrapper[4749]: I1129 02:47:45.112514 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-k877z" podStartSLOduration=1.112489085 podStartE2EDuration="1.112489085s" podCreationTimestamp="2025-11-29 02:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:45.107929364 +0000 UTC m=+5808.280079251" watchObservedRunningTime="2025-11-29 02:47:45.112489085 +0000 UTC m=+5808.284638962" Nov 29 02:47:46 crc kubenswrapper[4749]: I1129 02:47:46.080259 4749 generic.go:334] "Generic (PLEG): container finished" podID="2e39d797-8cc7-40f1-8929-2ab733b4da0b" containerID="d69572105badca04be2140e2b31841c6d87c36fb1b80f0e79f541dd20d59aa7c" exitCode=0 Nov 29 02:47:46 crc kubenswrapper[4749]: I1129 02:47:46.080333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k877z" event={"ID":"2e39d797-8cc7-40f1-8929-2ab733b4da0b","Type":"ContainerDied","Data":"d69572105badca04be2140e2b31841c6d87c36fb1b80f0e79f541dd20d59aa7c"} Nov 29 02:47:46 crc kubenswrapper[4749]: I1129 02:47:46.084104 4749 generic.go:334] "Generic (PLEG): container finished" podID="e9f351a6-2f7a-48f1-9b19-1c73da851367" containerID="d05157f45661b784be5a49101ed8c0aae65a5da610802f1aaf74b12a0c327ebf" exitCode=0 Nov 29 02:47:46 crc kubenswrapper[4749]: I1129 02:47:46.084162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1ac-account-create-update-rg4th" event={"ID":"e9f351a6-2f7a-48f1-9b19-1c73da851367","Type":"ContainerDied","Data":"d05157f45661b784be5a49101ed8c0aae65a5da610802f1aaf74b12a0c327ebf"} Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.095300 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:47 crc kubenswrapper[4749]: E1129 02:47:47.096728 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.648750 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.653327 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k877z" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.674828 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnxbq\" (UniqueName: \"kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq\") pod \"e9f351a6-2f7a-48f1-9b19-1c73da851367\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.677016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts\") pod \"e9f351a6-2f7a-48f1-9b19-1c73da851367\" (UID: \"e9f351a6-2f7a-48f1-9b19-1c73da851367\") " Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.677095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts\") pod \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.677298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svjtt\" (UniqueName: \"kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt\") pod \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\" (UID: \"2e39d797-8cc7-40f1-8929-2ab733b4da0b\") " Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.677814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e39d797-8cc7-40f1-8929-2ab733b4da0b" (UID: "2e39d797-8cc7-40f1-8929-2ab733b4da0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.678321 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e39d797-8cc7-40f1-8929-2ab733b4da0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.679208 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9f351a6-2f7a-48f1-9b19-1c73da851367" (UID: "e9f351a6-2f7a-48f1-9b19-1c73da851367"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.681617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq" (OuterVolumeSpecName: "kube-api-access-cnxbq") pod "e9f351a6-2f7a-48f1-9b19-1c73da851367" (UID: "e9f351a6-2f7a-48f1-9b19-1c73da851367"). InnerVolumeSpecName "kube-api-access-cnxbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.694948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt" (OuterVolumeSpecName: "kube-api-access-svjtt") pod "2e39d797-8cc7-40f1-8929-2ab733b4da0b" (UID: "2e39d797-8cc7-40f1-8929-2ab733b4da0b"). InnerVolumeSpecName "kube-api-access-svjtt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.779932 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9f351a6-2f7a-48f1-9b19-1c73da851367-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.779968 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svjtt\" (UniqueName: \"kubernetes.io/projected/2e39d797-8cc7-40f1-8929-2ab733b4da0b-kube-api-access-svjtt\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:47 crc kubenswrapper[4749]: I1129 02:47:47.779979 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnxbq\" (UniqueName: \"kubernetes.io/projected/e9f351a6-2f7a-48f1-9b19-1c73da851367-kube-api-access-cnxbq\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.114033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1ac-account-create-update-rg4th" event={"ID":"e9f351a6-2f7a-48f1-9b19-1c73da851367","Type":"ContainerDied","Data":"694ee90dd9aef16a0f6e4cb73d513184e2e2db29c618ce6a28a9238b3e1a2851"} Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.114074 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b1ac-account-create-update-rg4th" Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.114146 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694ee90dd9aef16a0f6e4cb73d513184e2e2db29c618ce6a28a9238b3e1a2851" Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.117352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-k877z" event={"ID":"2e39d797-8cc7-40f1-8929-2ab733b4da0b","Type":"ContainerDied","Data":"9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb"} Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.117397 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9851f824636655af9f328d64b53e84c0792bf2b25b30215be4356adbae91c1cb" Nov 29 02:47:48 crc kubenswrapper[4749]: I1129 02:47:48.117446 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-k877z" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.338536 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-brvj2"] Nov 29 02:47:49 crc kubenswrapper[4749]: E1129 02:47:49.340718 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e39d797-8cc7-40f1-8929-2ab733b4da0b" containerName="mariadb-database-create" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.340876 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e39d797-8cc7-40f1-8929-2ab733b4da0b" containerName="mariadb-database-create" Nov 29 02:47:49 crc kubenswrapper[4749]: E1129 02:47:49.340992 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f351a6-2f7a-48f1-9b19-1c73da851367" containerName="mariadb-account-create-update" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.341094 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f351a6-2f7a-48f1-9b19-1c73da851367" containerName="mariadb-account-create-update" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.341589 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e39d797-8cc7-40f1-8929-2ab733b4da0b" containerName="mariadb-database-create" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.341750 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f351a6-2f7a-48f1-9b19-1c73da851367" containerName="mariadb-account-create-update" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.342794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.345525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.345691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-67qdb" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.345704 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.355587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-brvj2"] Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.425559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w694l\" (UniqueName: \"kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.425634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.425694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 
02:47:49.425733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.425818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.426030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.527750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.527843 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.527954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.528037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w694l\" (UniqueName: \"kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.528091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.528150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.529144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.535014 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.536723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.537096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.541959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.551913 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w694l\" (UniqueName: \"kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l\") pod \"cinder-db-sync-brvj2\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:49 crc kubenswrapper[4749]: I1129 02:47:49.666878 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:50 crc kubenswrapper[4749]: I1129 02:47:50.163780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-brvj2"] Nov 29 02:47:50 crc kubenswrapper[4749]: W1129 02:47:50.168303 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7c3f510_09bd_45e0_afe6_9f4c3753d311.slice/crio-28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429 WatchSource:0}: Error finding container 28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429: Status 404 returned error can't find the container with id 28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429 Nov 29 02:47:51 crc kubenswrapper[4749]: I1129 02:47:51.156117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brvj2" event={"ID":"d7c3f510-09bd-45e0-afe6-9f4c3753d311","Type":"ContainerStarted","Data":"f6b1f13821cb38acb978d353217b93cb261b1eb3f56ab50d72b169d81ee37190"} Nov 29 02:47:51 crc kubenswrapper[4749]: I1129 02:47:51.156989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brvj2" event={"ID":"d7c3f510-09bd-45e0-afe6-9f4c3753d311","Type":"ContainerStarted","Data":"28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429"} Nov 29 02:47:51 crc kubenswrapper[4749]: I1129 02:47:51.199728 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-brvj2" podStartSLOduration=2.199706705 podStartE2EDuration="2.199706705s" podCreationTimestamp="2025-11-29 02:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:51.189496297 +0000 UTC m=+5814.361646184" watchObservedRunningTime="2025-11-29 02:47:51.199706705 +0000 UTC m=+5814.371856572" Nov 29 02:47:54 crc kubenswrapper[4749]: I1129 02:47:54.189778 4749 generic.go:334] "Generic (PLEG): container finished" podID="d7c3f510-09bd-45e0-afe6-9f4c3753d311" containerID="f6b1f13821cb38acb978d353217b93cb261b1eb3f56ab50d72b169d81ee37190" exitCode=0 Nov 29 02:47:54 crc kubenswrapper[4749]: I1129 02:47:54.189871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brvj2" event={"ID":"d7c3f510-09bd-45e0-afe6-9f4c3753d311","Type":"ContainerDied","Data":"f6b1f13821cb38acb978d353217b93cb261b1eb3f56ab50d72b169d81ee37190"} Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.623219 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775128 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775225 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775382 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w694l\" (UniqueName: \"kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data\") pod \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\" (UID: \"d7c3f510-09bd-45e0-afe6-9f4c3753d311\") " Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.775679 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.776368 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7c3f510-09bd-45e0-afe6-9f4c3753d311-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.798403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.798680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l" (OuterVolumeSpecName: "kube-api-access-w694l") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "kube-api-access-w694l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.800720 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts" (OuterVolumeSpecName: "scripts") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.819122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.848093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data" (OuterVolumeSpecName: "config-data") pod "d7c3f510-09bd-45e0-afe6-9f4c3753d311" (UID: "d7c3f510-09bd-45e0-afe6-9f4c3753d311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.877784 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w694l\" (UniqueName: \"kubernetes.io/projected/d7c3f510-09bd-45e0-afe6-9f4c3753d311-kube-api-access-w694l\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.877811 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.877821 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.877829 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:55 crc kubenswrapper[4749]: I1129 02:47:55.877839 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c3f510-09bd-45e0-afe6-9f4c3753d311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.217374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-brvj2" event={"ID":"d7c3f510-09bd-45e0-afe6-9f4c3753d311","Type":"ContainerDied","Data":"28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429"} Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.217848 4749 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="28dba10f68abc1cce412bbdd5179c120907ef389711c0cfa9cc7c093240a6429" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.217505 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-brvj2" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.664827 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:47:56 crc kubenswrapper[4749]: E1129 02:47:56.665504 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c3f510-09bd-45e0-afe6-9f4c3753d311" containerName="cinder-db-sync" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.665516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c3f510-09bd-45e0-afe6-9f4c3753d311" containerName="cinder-db-sync" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.665696 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c3f510-09bd-45e0-afe6-9f4c3753d311" containerName="cinder-db-sync" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.666698 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.676540 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.799925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dqbn\" (UniqueName: \"kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.799969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.800007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.800131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.800257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.844427 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.845944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.848695 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-67qdb" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.848964 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.849080 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.854570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.856054 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.901069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dqbn\" (UniqueName: \"kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.901112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.901142 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.901178 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.901234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.902063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.902788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.903309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.903786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.921973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dqbn\" (UniqueName: \"kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn\") pod \"dnsmasq-dns-57bc657d5c-djh2c\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:56 crc kubenswrapper[4749]: I1129 02:47:56.988441 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.002959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2r7p\" (UniqueName: \"kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003170 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.003279 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.105998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2r7p\" (UniqueName: \"kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.106376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.108009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.108609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.110973 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.111290 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.114081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.122974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.123463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.126866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.127375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2r7p\" (UniqueName: \"kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p\") pod \"cinder-api-0\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.198640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-67qdb" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.208084 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.489349 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:47:57 crc kubenswrapper[4749]: W1129 02:47:57.830805 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea6d65f_84ae_4ea9_8970_41d287e2f672.slice/crio-95d48b23ad7301b8276dd6d0c9caa3fe2c8d375084014a3998d508e58d72f8a8 WatchSource:0}: Error finding container 95d48b23ad7301b8276dd6d0c9caa3fe2c8d375084014a3998d508e58d72f8a8: Status 404 returned error can't find the container with id 95d48b23ad7301b8276dd6d0c9caa3fe2c8d375084014a3998d508e58d72f8a8 Nov 29 02:47:57 crc kubenswrapper[4749]: I1129 02:47:57.833600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:47:58 crc kubenswrapper[4749]: I1129 02:47:58.075666 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:47:58 crc kubenswrapper[4749]: E1129 02:47:58.076260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:47:58 crc kubenswrapper[4749]: I1129 02:47:58.259011 4749 generic.go:334] "Generic (PLEG): container finished" podID="920be425-9042-4830-8468-6dd624a20d43" containerID="f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed" exitCode=0 Nov 29 02:47:58 crc kubenswrapper[4749]: I1129 02:47:58.259076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" event={"ID":"920be425-9042-4830-8468-6dd624a20d43","Type":"ContainerDied","Data":"f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed"} Nov 29 02:47:58 crc kubenswrapper[4749]: I1129 02:47:58.259102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" event={"ID":"920be425-9042-4830-8468-6dd624a20d43","Type":"ContainerStarted","Data":"49b8c7db550ab7b46f364cc04956fd51686affc7fa1f586b44149ad9db7b076d"} Nov 29 02:47:58 crc kubenswrapper[4749]: I1129 02:47:58.262414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerStarted","Data":"95d48b23ad7301b8276dd6d0c9caa3fe2c8d375084014a3998d508e58d72f8a8"} Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.277004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerStarted","Data":"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057"} Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.277588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.277618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerStarted","Data":"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c"} Nov 29 02:47:59 crc 
kubenswrapper[4749]: I1129 02:47:59.287719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" event={"ID":"920be425-9042-4830-8468-6dd624a20d43","Type":"ContainerStarted","Data":"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8"} Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.287895 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.315944 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.315923611 podStartE2EDuration="3.315923611s" podCreationTimestamp="2025-11-29 02:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:59.29780293 +0000 UTC m=+5822.469952817" watchObservedRunningTime="2025-11-29 02:47:59.315923611 +0000 UTC m=+5822.488073478" Nov 29 02:47:59 crc kubenswrapper[4749]: I1129 02:47:59.324130 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" podStartSLOduration=3.32411665 podStartE2EDuration="3.32411665s" podCreationTimestamp="2025-11-29 02:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:47:59.314439445 +0000 UTC m=+5822.486589312" watchObservedRunningTime="2025-11-29 02:47:59.32411665 +0000 UTC m=+5822.496266517" Nov 29 02:48:06 crc kubenswrapper[4749]: I1129 02:48:06.990637 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.117408 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.117683 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="dnsmasq-dns" containerID="cri-o://94ba86033c46619795d94b063459b0bbc363ecdfaf469fc43b289b32e25721aa" gracePeriod=10 Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.403998 4749 generic.go:334] "Generic (PLEG): container finished" podID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerID="94ba86033c46619795d94b063459b0bbc363ecdfaf469fc43b289b32e25721aa" exitCode=0 Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.404130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" event={"ID":"007bfd49-61bd-4de4-a8fc-f1fefafa5b15","Type":"ContainerDied","Data":"94ba86033c46619795d94b063459b0bbc363ecdfaf469fc43b289b32e25721aa"} Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.636645 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.648152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc\") pod \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.648228 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb\") pod \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.648282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2x87\" (UniqueName: \"kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87\") pod \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.648340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb\") pod \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.648361 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config\") pod \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\" (UID: \"007bfd49-61bd-4de4-a8fc-f1fefafa5b15\") " Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.687483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87" (OuterVolumeSpecName: "kube-api-access-d2x87") pod "007bfd49-61bd-4de4-a8fc-f1fefafa5b15" (UID: "007bfd49-61bd-4de4-a8fc-f1fefafa5b15"). InnerVolumeSpecName "kube-api-access-d2x87". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.752427 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2x87\" (UniqueName: \"kubernetes.io/projected/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-kube-api-access-d2x87\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.767729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "007bfd49-61bd-4de4-a8fc-f1fefafa5b15" (UID: "007bfd49-61bd-4de4-a8fc-f1fefafa5b15"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.768087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "007bfd49-61bd-4de4-a8fc-f1fefafa5b15" (UID: "007bfd49-61bd-4de4-a8fc-f1fefafa5b15"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.768446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config" (OuterVolumeSpecName: "config") pod "007bfd49-61bd-4de4-a8fc-f1fefafa5b15" (UID: "007bfd49-61bd-4de4-a8fc-f1fefafa5b15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.793813 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "007bfd49-61bd-4de4-a8fc-f1fefafa5b15" (UID: "007bfd49-61bd-4de4-a8fc-f1fefafa5b15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.854813 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.854848 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.854865 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:07 crc kubenswrapper[4749]: I1129 02:48:07.854875 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/007bfd49-61bd-4de4-a8fc-f1fefafa5b15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.421138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" event={"ID":"007bfd49-61bd-4de4-a8fc-f1fefafa5b15","Type":"ContainerDied","Data":"574b18098f1d23338baf5b9fe471a2064cbe27c9aa3590c4fada5f4b12cdebdb"} Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.421209 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b44557d5-4jttm" Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.421466 4749 scope.go:117] "RemoveContainer" containerID="94ba86033c46619795d94b063459b0bbc363ecdfaf469fc43b289b32e25721aa" Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.449230 4749 scope.go:117] "RemoveContainer" containerID="021d9a17c74428f8f0fab71f44692d1ed82e733f5c3aca0a6bf7d38c7b7c8aea" Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.460494 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.469690 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b44557d5-4jttm"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.541334 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.541576 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log" containerID="cri-o://4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.541648 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api" containerID="cri-o://48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.558750 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.559026 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log" containerID="cri-o://2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.559115 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata" containerID="cri-o://52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.568250 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.568451 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.585538 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.585771 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b48ace4a6cdadc6166c13d965648a3a6d29494cc118013971549e3d2de5decb4" gracePeriod=30 Nov 29 02:48:08 crc kubenswrapper[4749]: 
I1129 02:48:08.593165 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:08 crc kubenswrapper[4749]: I1129 02:48:08.593421 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="820581a7-a881-4272-b198-b25ad45892dd" containerName="nova-scheduler-scheduler" containerID="cri-o://8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9" gracePeriod=30 Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.085904 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" path="/var/lib/kubelet/pods/007bfd49-61bd-4de4-a8fc-f1fefafa5b15/volumes" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.142710 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.450479 4749 generic.go:334] "Generic (PLEG): container finished" podID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" containerID="b48ace4a6cdadc6166c13d965648a3a6d29494cc118013971549e3d2de5decb4" exitCode=0 Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.450546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e","Type":"ContainerDied","Data":"b48ace4a6cdadc6166c13d965648a3a6d29494cc118013971549e3d2de5decb4"} Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.450573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e","Type":"ContainerDied","Data":"20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2"} Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.450583 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20f21579316b6259bdf86ebadbf209b4f22e9ca48fd9a7fb21d3fc567f4a4da2" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.450671 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.477429 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerID="4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659" exitCode=143 Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.477509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerDied","Data":"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659"} Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.495008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data\") pod \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.495137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle\") pod \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.495277 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmhbq\" (UniqueName: \"kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq\") pod \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\" (UID: \"4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e\") " Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.534421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq" (OuterVolumeSpecName: "kube-api-access-pmhbq") pod "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" (UID: "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e"). InnerVolumeSpecName "kube-api-access-pmhbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.534661 4749 generic.go:334] "Generic (PLEG): container finished" podID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerID="2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3" exitCode=143 Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.534744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerDied","Data":"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"} Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.569364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" (UID: "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.578001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data" (OuterVolumeSpecName: "config-data") pod "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" (UID: "4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.597738 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmhbq\" (UniqueName: \"kubernetes.io/projected/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-kube-api-access-pmhbq\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.597780 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:09 crc kubenswrapper[4749]: I1129 02:48:09.597794 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.263650 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.309837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data\") pod \"820581a7-a881-4272-b198-b25ad45892dd\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.309889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6kj\" (UniqueName: \"kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj\") pod \"820581a7-a881-4272-b198-b25ad45892dd\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.310002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle\") pod \"820581a7-a881-4272-b198-b25ad45892dd\" (UID: \"820581a7-a881-4272-b198-b25ad45892dd\") " Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.316570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj" (OuterVolumeSpecName: "kube-api-access-7c6kj") pod "820581a7-a881-4272-b198-b25ad45892dd" (UID: "820581a7-a881-4272-b198-b25ad45892dd"). InnerVolumeSpecName "kube-api-access-7c6kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.339144 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data" (OuterVolumeSpecName: "config-data") pod "820581a7-a881-4272-b198-b25ad45892dd" (UID: "820581a7-a881-4272-b198-b25ad45892dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.340524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "820581a7-a881-4272-b198-b25ad45892dd" (UID: "820581a7-a881-4272-b198-b25ad45892dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.427492 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.427548 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6kj\" (UniqueName: \"kubernetes.io/projected/820581a7-a881-4272-b198-b25ad45892dd-kube-api-access-7c6kj\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.427559 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/820581a7-a881-4272-b198-b25ad45892dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557269 4749 generic.go:334] "Generic (PLEG): container finished" podID="820581a7-a881-4272-b198-b25ad45892dd" containerID="8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9" exitCode=0 Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557322 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557537 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"820581a7-a881-4272-b198-b25ad45892dd","Type":"ContainerDied","Data":"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9"} Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"820581a7-a881-4272-b198-b25ad45892dd","Type":"ContainerDied","Data":"0a090605667210cf9cb84557714a2947352097440b59210e6dce5f812da61f5f"} Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.557660 4749 scope.go:117] "RemoveContainer" containerID="8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.643540 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.673791 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.698425 4749 scope.go:117] "RemoveContainer" containerID="8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.698572 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: E1129 02:48:10.703014 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9\": container with ID starting with 8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9 not found: ID does not exist" containerID="8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.703070 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9"} err="failed to get container status \"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9\": rpc error: code = NotFound desc = could not find container \"8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9\": container with ID starting with 8ea4ea44acae2b24993c0a0e061f0be0b5f061e1674a4da40368ffbef138eda9 not found: ID does not exist" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.719400 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.727561 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: E1129 02:48:10.728054 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="dnsmasq-dns" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728120 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="dnsmasq-dns" Nov 29 02:48:10 crc kubenswrapper[4749]: E1129 02:48:10.728189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820581a7-a881-4272-b198-b25ad45892dd" containerName="nova-scheduler-scheduler" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728259 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="820581a7-a881-4272-b198-b25ad45892dd" containerName="nova-scheduler-scheduler" Nov 29 02:48:10 crc kubenswrapper[4749]: E1129 02:48:10.728313 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="init" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728364 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="init" Nov 29 02:48:10 crc kubenswrapper[4749]: E1129 02:48:10.728432 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728491 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728794 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="007bfd49-61bd-4de4-a8fc-f1fefafa5b15" containerName="dnsmasq-dns" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728862 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="820581a7-a881-4272-b198-b25ad45892dd" containerName="nova-scheduler-scheduler" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.728927 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" containerName="nova-cell1-novncproxy-novncproxy" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.729592 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.731290 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.738732 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.747599 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.748680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.750536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.756653 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8df\" (UniqueName: \"kubernetes.io/projected/cbf6f457-e2a8-4503-8120-81ef8237ef59-kube-api-access-vc8df\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqs4n\" (UniqueName: \"kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841283 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.841563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8df\" (UniqueName: \"kubernetes.io/projected/cbf6f457-e2a8-4503-8120-81ef8237ef59-kube-api-access-vc8df\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqs4n\" (UniqueName: \"kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.943988 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.950135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.954313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.954858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbf6f457-e2a8-4503-8120-81ef8237ef59-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.961224 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " 
pod="openstack/nova-scheduler-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.962344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8df\" (UniqueName: \"kubernetes.io/projected/cbf6f457-e2a8-4503-8120-81ef8237ef59-kube-api-access-vc8df\") pod \"nova-cell1-novncproxy-0\" (UID: \"cbf6f457-e2a8-4503-8120-81ef8237ef59\") " pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:10 crc kubenswrapper[4749]: I1129 02:48:10.971050 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqs4n\" (UniqueName: \"kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n\") pod \"nova-scheduler-0\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " pod="openstack/nova-scheduler-0" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.059568 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.068028 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.090941 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e" path="/var/lib/kubelet/pods/4dd1cf26-1b5c-4f49-9790-f87d30cf5c4e/volumes" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.091870 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820581a7-a881-4272-b198-b25ad45892dd" path="/var/lib/kubelet/pods/820581a7-a881-4272-b198-b25ad45892dd/volumes" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.560162 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.629561 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 02:48:11 crc kubenswrapper[4749]: W1129 02:48:11.651298 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5f3f01_7951_4368_ac89_9e98a03dd5b3.slice/crio-a0cbbee9627b1e1039a4353476fd279a19d26496a2c0b42ebf12a2afadb5d436 WatchSource:0}: Error finding container a0cbbee9627b1e1039a4353476fd279a19d26496a2c0b42ebf12a2afadb5d436: Status 404 returned error can't find the container with id a0cbbee9627b1e1039a4353476fd279a19d26496a2c0b42ebf12a2afadb5d436 Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.709846 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:48542->10.217.1.71:8774: read: connection reset by peer" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.710519 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": read tcp 10.217.0.2:48540->10.217.1.71:8774: read: connection reset by peer" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.723317 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": read tcp 
10.217.0.2:43334->10.217.1.70:8775: read: connection reset by peer" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.723317 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.70:8775/\": read tcp 10.217.0.2:43344->10.217.1.70:8775: read: connection reset by peer" Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.781190 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:11 crc kubenswrapper[4749]: I1129 02:48:11.781488 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="7b019e06-5331-4fc8-b736-dd6e5670c45c" containerName="nova-cell1-conductor-conductor" containerID="cri-o://16a5834f7550b4b2fae489791dea50dada2a6621a59520c5a9b1a9075200778a" gracePeriod=30 Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.152210 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.165813 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data\") pod \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.165845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs\") pod \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.166029 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle\") pod \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.166125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfrxf\" (UniqueName: \"kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf\") pod \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\" (UID: \"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.166677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs" (OuterVolumeSpecName: "logs") pod "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" (UID: "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.168710 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.193485 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf" (OuterVolumeSpecName: "kube-api-access-qfrxf") pod "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" (UID: "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1"). InnerVolumeSpecName "kube-api-access-qfrxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.193737 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.230494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data" (OuterVolumeSpecName: "config-data") pod "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" (UID: "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.266781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" (UID: "ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.275350 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkhd\" (UniqueName: \"kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd\") pod \"39110a88-02a9-4031-a747-6c83d4e69aa4\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.275440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs\") pod \"39110a88-02a9-4031-a747-6c83d4e69aa4\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.275597 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data\") pod \"39110a88-02a9-4031-a747-6c83d4e69aa4\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.275763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle\") pod \"39110a88-02a9-4031-a747-6c83d4e69aa4\" (UID: \"39110a88-02a9-4031-a747-6c83d4e69aa4\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.276107 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfrxf\" (UniqueName: \"kubernetes.io/projected/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-kube-api-access-qfrxf\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 
02:48:12.276138 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.276148 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.276982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs" (OuterVolumeSpecName: "logs") pod "39110a88-02a9-4031-a747-6c83d4e69aa4" (UID: "39110a88-02a9-4031-a747-6c83d4e69aa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.279730 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd" (OuterVolumeSpecName: "kube-api-access-4tkhd") pod "39110a88-02a9-4031-a747-6c83d4e69aa4" (UID: "39110a88-02a9-4031-a747-6c83d4e69aa4"). InnerVolumeSpecName "kube-api-access-4tkhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.301330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data" (OuterVolumeSpecName: "config-data") pod "39110a88-02a9-4031-a747-6c83d4e69aa4" (UID: "39110a88-02a9-4031-a747-6c83d4e69aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.318022 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39110a88-02a9-4031-a747-6c83d4e69aa4" (UID: "39110a88-02a9-4031-a747-6c83d4e69aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.353835 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.379583 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data\") pod \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.379723 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnlxn\" (UniqueName: \"kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn\") pod \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.379807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle\") pod \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\" (UID: \"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b\") " Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.380345 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.380358 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkhd\" (UniqueName: \"kubernetes.io/projected/39110a88-02a9-4031-a747-6c83d4e69aa4-kube-api-access-4tkhd\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.380368 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/39110a88-02a9-4031-a747-6c83d4e69aa4-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.380376 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39110a88-02a9-4031-a747-6c83d4e69aa4-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.399139 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn" (OuterVolumeSpecName: "kube-api-access-jnlxn") pod "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" (UID: "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b"). InnerVolumeSpecName "kube-api-access-jnlxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.407942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data" (OuterVolumeSpecName: "config-data") pod "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" (UID: "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.412962 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" (UID: "24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.482387 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.482418 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.482428 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnlxn\" (UniqueName: \"kubernetes.io/projected/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b-kube-api-access-jnlxn\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.586063 4749 generic.go:334] "Generic (PLEG): container finished" podID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerID="48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7" exitCode=0 Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.586153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerDied","Data":"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.586192 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1","Type":"ContainerDied","Data":"29ddd33d3f34ed4fd370ef538bcc1293bbfee81e9b490fd8e2be35345b9e1307"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.586247 4749 scope.go:117] "RemoveContainer" containerID="48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.586378 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.605443 4749 generic.go:334] "Generic (PLEG): container finished" podID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" containerID="6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d" exitCode=0 Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.605535 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.605541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b","Type":"ContainerDied","Data":"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.605690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b","Type":"ContainerDied","Data":"e8c9f3e5795767ff07d20f6174545d02aa84f5ca9d14ea5c2eeb32dd3460b547"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.608236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e5f3f01-7951-4368-ac89-9e98a03dd5b3","Type":"ContainerStarted","Data":"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.608357 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e5f3f01-7951-4368-ac89-9e98a03dd5b3","Type":"ContainerStarted","Data":"a0cbbee9627b1e1039a4353476fd279a19d26496a2c0b42ebf12a2afadb5d436"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.610860 4749 generic.go:334] "Generic (PLEG): container finished" podID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerID="52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035" exitCode=0 Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.610965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerDied","Data":"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.611035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"39110a88-02a9-4031-a747-6c83d4e69aa4","Type":"ContainerDied","Data":"a9e4a9e9471c925a6916a9b55cf799ac4d6bb6933738a41b0c7600c9ac5b7e9c"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.611138 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.647273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbf6f457-e2a8-4503-8120-81ef8237ef59","Type":"ContainerStarted","Data":"e033bc4843090365d0d109e3705d4c916b99fbbfc56c4efcf485acfea9be6000"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.647512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cbf6f457-e2a8-4503-8120-81ef8237ef59","Type":"ContainerStarted","Data":"e8509617e562dbec4cd30fdb5dcb0c0006da94fc12514acbcd37d4769d34732f"} Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.665467 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.665450223 podStartE2EDuration="2.665450223s" podCreationTimestamp="2025-11-29 02:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:12.642131786 +0000 UTC m=+5835.814281643" watchObservedRunningTime="2025-11-29 02:48:12.665450223 +0000 UTC m=+5835.837600080" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.703907 4749 scope.go:117] "RemoveContainer" containerID="4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.742865 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.742844544 podStartE2EDuration="2.742844544s" podCreationTimestamp="2025-11-29 02:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:12.679670799 +0000 UTC m=+5835.851820666" watchObservedRunningTime="2025-11-29 02:48:12.742844544 +0000 UTC m=+5835.914994401" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.743359 4749 scope.go:117] "RemoveContainer" containerID="48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7" Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.748329 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7\": container with ID starting with 48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7 not found: ID does not exist" containerID="48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.748389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7"} err="failed to get container status \"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7\": rpc error: code = NotFound desc = could not find container \"48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7\": container with ID starting with 48c7bb676e528041b9739b47cff4138fb50b9d7a0711a8bb77700b389e3e47e7 not found: ID does not exist" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.748415 4749 scope.go:117] "RemoveContainer" containerID="4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659" Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.754476 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659\": container with ID starting with 4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659 not found: ID does not exist" containerID="4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.754517 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659"} err="failed to get container status \"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659\": rpc error: code = NotFound desc = could not find container \"4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659\": container with ID starting with 4b92fa32dca27b23c6f5252fef736b4af9bffb577729ece1ed5478089fc54659 not found: ID does not exist" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.754542 4749 scope.go:117] "RemoveContainer" containerID="6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.767276 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.778057 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.792980 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.797252 4749 scope.go:117] "RemoveContainer" containerID="6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.802235 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.813385 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d\": container with ID starting with 6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d not found: ID does not exist" containerID="6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.813463 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d"} err="failed to get container status \"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d\": rpc error: code = NotFound desc = could not find container \"6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d\": container with ID starting with 6f434d4ac3b059d4f8d7ed86395b5d94e45cb25e75b504f6c3717ea659801d4d not found: ID does not exist" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.813493 4749 scope.go:117] "RemoveContainer" containerID="52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035" Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.816811 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.835654 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.867262 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.868702 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.868784 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.868919 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" containerName="nova-cell0-conductor-conductor"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.869019 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" containerName="nova-cell0-conductor-conductor"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.869126 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.869230 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.869322 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.869409 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.869499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.869565 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.869939 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.870039 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-metadata"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.870116 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" containerName="nova-cell0-conductor-conductor"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.870173 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" containerName="nova-metadata-log"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.870265 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" containerName="nova-api-api"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.871812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.878998 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.879581 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.885447 4749 scope.go:117] "RemoveContainer" containerID="2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.896172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx9hl\" (UniqueName: \"kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.896409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.906454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.906576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.942622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.943759 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.945468 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.965714 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.968661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.971860 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.972521 4749 scope.go:117] "RemoveContainer" containerID="52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.972860 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035\": container with ID starting with 52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035 not found: ID does not exist" containerID="52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.972893 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035"} err="failed to get container status \"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035\": rpc error: code = NotFound desc = could not find container \"52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035\": container with ID starting with 52dc06bc04f6b2df7947b774d93bd3b948cc8d4d8e454ac30a824f9279134035 not found: ID does not exist"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.972910 4749 scope.go:117] "RemoveContainer" containerID="2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"
Nov 29 02:48:12 crc kubenswrapper[4749]: E1129 02:48:12.973242 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3\": container with ID starting with 2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3 not found: ID does not exist" containerID="2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.973311 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3"} err="failed to get container status \"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3\": rpc error: code = NotFound desc = could not find container \"2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3\": container with ID starting with 2a5407719518ca5963f7c5294096baf347cd7dd2578726f4de54c77f4e7707a3 not found: ID does not exist"
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.980401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 29 02:48:12 crc kubenswrapper[4749]: I1129 02:48:12.989460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfkz9\" (UniqueName: \"kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx9hl\" (UniqueName: \"kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjws\" (UniqueName: \"kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.008783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.009462 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.013725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.015320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.023182 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx9hl\" (UniqueName: \"kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl\") pod \"nova-api-0\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.077354 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:48:13 crc kubenswrapper[4749]: E1129 02:48:13.077610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.086640 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b" path="/var/lib/kubelet/pods/24e5f0f3-e60f-4a5f-b55b-be7fa907ca1b/volumes" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.087269 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39110a88-02a9-4031-a747-6c83d4e69aa4" path="/var/lib/kubelet/pods/39110a88-02a9-4031-a747-6c83d4e69aa4/volumes" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.087831 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1" path="/var/lib/kubelet/pods/ffd077c8-32f6-438d-8cf6-5e9a2d7fbbe1/volumes" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.109826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfkz9\" (UniqueName: \"kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9\") pod 
\"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.110693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqjws\" (UniqueName: \"kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.113969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.117270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.120029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.121205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.121349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.138826 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfkz9\" (UniqueName: \"kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9\") pod \"nova-cell0-conductor-0\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.140144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqjws\" (UniqueName: \"kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws\") pod \"nova-metadata-0\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.243356 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.277505 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.298724 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.804952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.884843 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 02:48:13 crc kubenswrapper[4749]: I1129 02:48:13.963812 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 02:48:13 crc kubenswrapper[4749]: W1129 02:48:13.968857 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74d94086_38fb_4dfa_8cb1_09f2ca302406.slice/crio-1b120063e6e64087c5cd8e124e6308c915122ef3d85daa6570286150913e5abd WatchSource:0}: Error finding container 1b120063e6e64087c5cd8e124e6308c915122ef3d85daa6570286150913e5abd: Status 404 returned error can't find the container with id 1b120063e6e64087c5cd8e124e6308c915122ef3d85daa6570286150913e5abd Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.680319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"74d94086-38fb-4dfa-8cb1-09f2ca302406","Type":"ContainerStarted","Data":"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.680652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"74d94086-38fb-4dfa-8cb1-09f2ca302406","Type":"ContainerStarted","Data":"1b120063e6e64087c5cd8e124e6308c915122ef3d85daa6570286150913e5abd"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.680693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.682422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerStarted","Data":"e254aab0196ba3756f1af170fc7a3657ba70c5bc2ce0d97642e1beca12843cc9"} Nov 29 02:48:14 
crc kubenswrapper[4749]: I1129 02:48:14.682445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerStarted","Data":"e6bbd4ce061db29771862d9d9e2d044ae3b6847acc9c278038db93b75d7c1598"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.682456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerStarted","Data":"a931130275e40f02afd9bbd08d46fd1060a3a4d349b19d7f6bd6e07f99df4612"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.684015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerStarted","Data":"844b63d7ef81b44bf6d99aac2f89407a540f9f2ec1504d239bcc104aa536427d"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.684040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerStarted","Data":"cf7d7fc251dc251e8dadd04344a6d436cee05a333ba8ea5cb11dd5df1561aed9"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.684051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerStarted","Data":"55ab4cbb11dc82de3a38da352b98dfc4bafa1c3eb092dccc874476d811c00176"} Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.701666 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.701645023 podStartE2EDuration="2.701645023s" podCreationTimestamp="2025-11-29 02:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:14.699596623 +0000 UTC m=+5837.871746480" watchObservedRunningTime="2025-11-29 02:48:14.701645023 +0000 UTC m=+5837.873794890" Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.744132 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.744106505 podStartE2EDuration="2.744106505s" podCreationTimestamp="2025-11-29 02:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:14.723846542 +0000 UTC m=+5837.895996419" watchObservedRunningTime="2025-11-29 02:48:14.744106505 +0000 UTC m=+5837.916256362" Nov 29 02:48:14 crc kubenswrapper[4749]: I1129 02:48:14.772257 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.772229798 podStartE2EDuration="2.772229798s" podCreationTimestamp="2025-11-29 02:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:14.747659131 +0000 UTC m=+5837.919809038" watchObservedRunningTime="2025-11-29 02:48:14.772229798 +0000 UTC m=+5837.944379705" Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.711729 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b019e06-5331-4fc8-b736-dd6e5670c45c" containerID="16a5834f7550b4b2fae489791dea50dada2a6621a59520c5a9b1a9075200778a" exitCode=0 Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.711855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
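In the pod_startup_latency_tracker records, podStartSLOduration is just watchObservedRunningTime minus podCreationTimestamp; the pull fields sit at Go's zero time, presumably because no image pull was needed. A worked check in Python against the nova-scheduler-0 record earlier in this journal, with the fields abridged to the ones used:

    import re
    from datetime import datetime

    # Fields copied (abridged) from the nova-scheduler-0 tracker record.
    line = ('podStartSLOduration=2.665450223 '
            'podCreationTimestamp="2025-11-29 02:48:10 +0000 UTC" '
            'watchObservedRunningTime="2025-11-29 02:48:12.665450223 +0000 UTC m=+5835.837600080"')

    def parse(ts):
        # Drop the monotonic "m=+..." suffix and the trailing "UTC" label,
        # and trim nanoseconds to the microseconds strptime's %f accepts.
        ts = re.sub(r" m=\+\S+$", "", ts).replace(" UTC", "")
        ts = re.sub(r"(\.\d{6})\d+", r"\1", ts)
        fmt = "%Y-%m-%d %H:%M:%S.%f %z" if "." in ts else "%Y-%m-%d %H:%M:%S %z"
        return datetime.strptime(ts, fmt)

    slo = float(re.search(r"podStartSLOduration=([\d.]+)", line).group(1))
    created = parse(re.search(r'podCreationTimestamp="([^"]+)"', line).group(1))
    observed = parse(re.search(r'watchObservedRunningTime="([^"]+)"', line).group(1))
    print(slo, (observed - created).total_seconds())  # 2.665450223 vs ~2.665450

The same arithmetic reproduces the 2.70s, 2.74s, and 2.77s figures reported above for nova-cell0-conductor-0, nova-api-0, and nova-metadata-0.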
pod="openstack/nova-cell1-conductor-0" event={"ID":"7b019e06-5331-4fc8-b736-dd6e5670c45c","Type":"ContainerDied","Data":"16a5834f7550b4b2fae489791dea50dada2a6621a59520c5a9b1a9075200778a"} Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.884537 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.971479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle\") pod \"7b019e06-5331-4fc8-b736-dd6e5670c45c\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.971650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drgk7\" (UniqueName: \"kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7\") pod \"7b019e06-5331-4fc8-b736-dd6e5670c45c\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.971720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data\") pod \"7b019e06-5331-4fc8-b736-dd6e5670c45c\" (UID: \"7b019e06-5331-4fc8-b736-dd6e5670c45c\") " Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.978073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7" (OuterVolumeSpecName: "kube-api-access-drgk7") pod "7b019e06-5331-4fc8-b736-dd6e5670c45c" (UID: "7b019e06-5331-4fc8-b736-dd6e5670c45c"). InnerVolumeSpecName "kube-api-access-drgk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:15 crc kubenswrapper[4749]: I1129 02:48:15.999717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data" (OuterVolumeSpecName: "config-data") pod "7b019e06-5331-4fc8-b736-dd6e5670c45c" (UID: "7b019e06-5331-4fc8-b736-dd6e5670c45c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.010523 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b019e06-5331-4fc8-b736-dd6e5670c45c" (UID: "7b019e06-5331-4fc8-b736-dd6e5670c45c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.060368 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.068663 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.074989 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.075020 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drgk7\" (UniqueName: \"kubernetes.io/projected/7b019e06-5331-4fc8-b736-dd6e5670c45c-kube-api-access-drgk7\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.075212 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b019e06-5331-4fc8-b736-dd6e5670c45c-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.735763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7b019e06-5331-4fc8-b736-dd6e5670c45c","Type":"ContainerDied","Data":"db9270c6dfe098126977ffe5d502d3884a9515546ecbd1a33368065d30652c0e"} Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.735813 4749 scope.go:117] "RemoveContainer" containerID="16a5834f7550b4b2fae489791dea50dada2a6621a59520c5a9b1a9075200778a" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.735937 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.796912 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.808256 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.819248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:16 crc kubenswrapper[4749]: E1129 02:48:16.819733 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b019e06-5331-4fc8-b736-dd6e5670c45c" containerName="nova-cell1-conductor-conductor" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.819753 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b019e06-5331-4fc8-b736-dd6e5670c45c" containerName="nova-cell1-conductor-conductor" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.819978 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b019e06-5331-4fc8-b736-dd6e5670c45c" containerName="nova-cell1-conductor-conductor" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.820761 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.824909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.830841 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.891873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pxm\" (UniqueName: \"kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.891950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.892033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.993991 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pxm\" (UniqueName: \"kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.994212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.994304 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:16 crc kubenswrapper[4749]: I1129 02:48:16.999704 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.000240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.012256 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pxm\" (UniqueName: \"kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm\") pod \"nova-cell1-conductor-0\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.088776 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b019e06-5331-4fc8-b736-dd6e5670c45c" path="/var/lib/kubelet/pods/7b019e06-5331-4fc8-b736-dd6e5670c45c/volumes" Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.149796 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.630130 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 02:48:17 crc kubenswrapper[4749]: W1129 02:48:17.635648 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78f71145_5cfd_4d13_a7c5_37301910d02b.slice/crio-635158f8754c8ec2e43d7775d34b28b99446a46593795f03bfb5c84da023b568 WatchSource:0}: Error finding container 635158f8754c8ec2e43d7775d34b28b99446a46593795f03bfb5c84da023b568: Status 404 returned error can't find the container with id 635158f8754c8ec2e43d7775d34b28b99446a46593795f03bfb5c84da023b568 Nov 29 02:48:17 crc kubenswrapper[4749]: I1129 02:48:17.749841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78f71145-5cfd-4d13-a7c5-37301910d02b","Type":"ContainerStarted","Data":"635158f8754c8ec2e43d7775d34b28b99446a46593795f03bfb5c84da023b568"} Nov 29 02:48:18 crc kubenswrapper[4749]: I1129 02:48:18.300141 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:48:18 crc kubenswrapper[4749]: I1129 02:48:18.300508 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 02:48:18 crc kubenswrapper[4749]: I1129 02:48:18.762511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78f71145-5cfd-4d13-a7c5-37301910d02b","Type":"ContainerStarted","Data":"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc"} Nov 29 02:48:18 crc kubenswrapper[4749]: I1129 02:48:18.762658 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:18 crc kubenswrapper[4749]: I1129 02:48:18.783105 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.783091343 podStartE2EDuration="2.783091343s" podCreationTimestamp="2025-11-29 02:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:18.779289991 +0000 UTC m=+5841.951439878" watchObservedRunningTime="2025-11-29 02:48:18.783091343 +0000 UTC m=+5841.955241190" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.059881 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.068873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.098170 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.130917 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.827862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 29 02:48:21 crc kubenswrapper[4749]: I1129 02:48:21.884836 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 02:48:22 crc kubenswrapper[4749]: I1129 02:48:22.197656 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 02:48:23 crc kubenswrapper[4749]: I1129 02:48:23.244692 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:48:23 crc kubenswrapper[4749]: I1129 02:48:23.245697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 02:48:23 crc kubenswrapper[4749]: I1129 02:48:23.302306 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:48:23 crc kubenswrapper[4749]: I1129 02:48:23.302346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 02:48:23 crc kubenswrapper[4749]: I1129 02:48:23.347768 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 02:48:24 crc kubenswrapper[4749]: I1129 02:48:24.328403 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:48:24 crc kubenswrapper[4749]: I1129 02:48:24.328463 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:48:24 crc kubenswrapper[4749]: I1129 02:48:24.410564 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:48:24 crc kubenswrapper[4749]: I1129 02:48:24.410622 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 02:48:26 crc kubenswrapper[4749]: I1129 02:48:26.076326 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:48:26 crc kubenswrapper[4749]: E1129 02:48:26.076938 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.390588 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.392866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.398068 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.415066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494295 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nddrp\" (UniqueName: \"kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494551 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.494735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.596829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.596945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.596971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.597055 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.597089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nddrp\" (UniqueName: \"kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.597119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.597975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.605654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.605748 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.606489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.611680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.631435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nddrp\" (UniqueName: \"kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp\") pod \"cinder-scheduler-0\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:27 crc kubenswrapper[4749]: I1129 02:48:27.757632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:28 crc kubenswrapper[4749]: I1129 02:48:28.281008 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:28 crc kubenswrapper[4749]: W1129 02:48:28.286053 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod093d202b_1b95_4ebc_932a_ce6062c3abac.slice/crio-e20c29f5a6a6feadf0d96b26ae3b74db4250f03794063138ef1d3dd16cf73695 WatchSource:0}: Error finding container e20c29f5a6a6feadf0d96b26ae3b74db4250f03794063138ef1d3dd16cf73695: Status 404 returned error can't find the container with id e20c29f5a6a6feadf0d96b26ae3b74db4250f03794063138ef1d3dd16cf73695
Nov 29 02:48:28 crc kubenswrapper[4749]: I1129 02:48:28.893164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerStarted","Data":"e20c29f5a6a6feadf0d96b26ae3b74db4250f03794063138ef1d3dd16cf73695"}
Nov 29 02:48:28 crc kubenswrapper[4749]: I1129 02:48:28.953919 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Nov 29 02:48:28 crc kubenswrapper[4749]: I1129 02:48:28.954182 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api" containerID="cri-o://d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057" gracePeriod=30
Nov 29 02:48:28 crc kubenswrapper[4749]: I1129 02:48:28.954140 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api-log" containerID="cri-o://7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c" gracePeriod=30
Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.612728 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.614143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.616891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.627547 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.676569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.676894 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677216 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fjm\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-kube-api-access-t5fjm\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677315 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 
02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-run\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.677734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779718 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779771 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779909 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fjm\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-kube-api-access-t5fjm\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779945 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.779983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-run\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780127 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-dev\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.780791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-run\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781054 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781016 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.781113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1bad501-c950-4fad-b698-59f5ad3f3e63-sys\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.785558 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.785563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.787719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.788731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: 
I1129 02:48:29.799420 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1bad501-c950-4fad-b698-59f5ad3f3e63-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.800024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fjm\" (UniqueName: \"kubernetes.io/projected/f1bad501-c950-4fad-b698-59f5ad3f3e63-kube-api-access-t5fjm\") pod \"cinder-volume-volume1-0\" (UID: \"f1bad501-c950-4fad-b698-59f5ad3f3e63\") " pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.904788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerStarted","Data":"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"} Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.904841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerStarted","Data":"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"} Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.907364 4749 generic.go:334] "Generic (PLEG): container finished" podID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerID="7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c" exitCode=143 Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.907400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerDied","Data":"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c"} Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.931237 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 29 02:48:29 crc kubenswrapper[4749]: I1129 02:48:29.940164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.940147358 podStartE2EDuration="2.940147358s" podCreationTimestamp="2025-11-29 02:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:29.932889011 +0000 UTC m=+5853.105038868" watchObservedRunningTime="2025-11-29 02:48:29.940147358 +0000 UTC m=+5853.112297215" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.288488 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.290180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
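The lines above show the reconciler walking every cinder-volume-volume1-0 volume through three stages: "operationExecutor.VerifyControllerAttachedVolume started", "operationExecutor.MountVolume started", and "MountVolume.SetUp succeeded". In a dump this dense it is easy to miss a volume that never reaches the final stage. The following is a throwaway triage sketch, not kubelet code: only the quoted message strings are taken from the log, everything else is an assumption, and it assumes one journal entry per line as journalctl normally emits them.

```go
// mountcheck.go - scan a kubelet journal dump on stdin and report, per pod,
// any volume that logged "MountVolume started" but never logged
// "MountVolume.SetUp succeeded". A hypothetical helper for log triage only.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// The \\? allows for the escaped quotes (\") that journald shows
	// around volume names in these structured klog messages.
	started   = regexp.MustCompile(`operationExecutor\.MountVolume started for volume \\?"([^"\\]+)\\?".*?pod="([^"]+)"`)
	succeeded = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)\\?".*?pod="([^"]+)"`)
)

func main() {
	pending := map[string]bool{} // key: pod + "/" + volume
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		for _, m := range started.FindAllStringSubmatch(line, -1) {
			pending[m[2]+"/"+m[1]] = true
		}
		for _, m := range succeeded.FindAllStringSubmatch(line, -1) {
			delete(pending, m[2]+"/"+m[1])
		}
	}
	for k := range pending {
		fmt.Println("no MountVolume.SetUp succeeded seen for", k)
	}
}
```

Run against this window (for example `journalctl -u kubelet | go run mountcheck.go`) it should print nothing for openstack/cinder-volume-volume1-0, since all sixteen volumes reach SetUp within about 120 ms of the mount phase starting.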
Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.293723 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.299185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.395599 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-sys\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.395654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-dev\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396332 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvfg\" (UniqueName: \"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-kube-api-access-lrvfg\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-run\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-scripts\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-lib-modules\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.396640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-ceph\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.464585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-scripts\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-lib-modules\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-ceph\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.497982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-sys\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498008 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-dev\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvfg\" (UniqueName: \"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-kube-api-access-lrvfg\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498123 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-run\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-sys\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-lib-modules\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-dev\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498603 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-run\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.498687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.499183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a36dabb0-136d-40c7-b9b1-7174cd3ba355-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.503654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.504271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-scripts\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.505728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.510150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a36dabb0-136d-40c7-b9b1-7174cd3ba355-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.513247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-ceph\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.515898 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvfg\" (UniqueName: 
\"kubernetes.io/projected/a36dabb0-136d-40c7-b9b1-7174cd3ba355-kube-api-access-lrvfg\") pod \"cinder-backup-0\" (UID: \"a36dabb0-136d-40c7-b9b1-7174cd3ba355\") " pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.621847 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 29 02:48:30 crc kubenswrapper[4749]: I1129 02:48:30.926283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f1bad501-c950-4fad-b698-59f5ad3f3e63","Type":"ContainerStarted","Data":"80b75ec3f20cc45fcd10929f9c8d1a157db59d334451c62b3474f17310973d47"} Nov 29 02:48:31 crc kubenswrapper[4749]: I1129 02:48:31.224120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 29 02:48:31 crc kubenswrapper[4749]: W1129 02:48:31.251652 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda36dabb0_136d_40c7_b9b1_7174cd3ba355.slice/crio-dbfc45ab2f2e272fac7cd4f739d1c6cbb12dae8f8d52d84e29b9cdc62bc297b4 WatchSource:0}: Error finding container dbfc45ab2f2e272fac7cd4f739d1c6cbb12dae8f8d52d84e29b9cdc62bc297b4: Status 404 returned error can't find the container with id dbfc45ab2f2e272fac7cd4f739d1c6cbb12dae8f8d52d84e29b9cdc62bc297b4 Nov 29 02:48:31 crc kubenswrapper[4749]: I1129 02:48:31.938152 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a36dabb0-136d-40c7-b9b1-7174cd3ba355","Type":"ContainerStarted","Data":"dbfc45ab2f2e272fac7cd4f739d1c6cbb12dae8f8d52d84e29b9cdc62bc297b4"} Nov 29 02:48:31 crc kubenswrapper[4749]: I1129 02:48:31.940399 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f1bad501-c950-4fad-b698-59f5ad3f3e63","Type":"ContainerStarted","Data":"845f9d182fbc68427e9d45305ea677ae367c53220f4b944ec0a383f89740c05e"} Nov 29 02:48:31 crc kubenswrapper[4749]: I1129 02:48:31.940422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"f1bad501-c950-4fad-b698-59f5ad3f3e63","Type":"ContainerStarted","Data":"f5e6bb12faa4aae4db5f54250e9da9d373d704c36e158cfee50cd8293ec81aa5"} Nov 29 02:48:31 crc kubenswrapper[4749]: I1129 02:48:31.977564 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.138116305 podStartE2EDuration="2.977545457s" podCreationTimestamp="2025-11-29 02:48:29 +0000 UTC" firstStartedPulling="2025-11-29 02:48:30.463646372 +0000 UTC m=+5853.635796229" lastFinishedPulling="2025-11-29 02:48:31.303075514 +0000 UTC m=+5854.475225381" observedRunningTime="2025-11-29 02:48:31.969672956 +0000 UTC m=+5855.141822813" watchObservedRunningTime="2025-11-29 02:48:31.977545457 +0000 UTC m=+5855.149695314" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.210275 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": dial tcp 10.217.1.78:8776: connect: connection refused" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.497695 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
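The "Probe failed" record above includes the exact request the prober issued, so the failure can be reproduced by hand from the node while cinder-api restarts. A minimal sketch: the URL is copied from the log line, but the 1-second timeout is an assumption rather than the pod's actual probe configuration, and the pod IP 10.217.1.78 is only routable from inside the cluster network.

```go
// probecheck.go - re-issue, by hand, the readiness probe the kubelet
// reports as failing above. Meant to be run on the node itself.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second} // timeout is an assumed value
	resp, err := client.Get("http://10.217.1.78:8776/healthcheck")
	if err != nil {
		// During the restart window logged above, this prints the same
		// "connection refused" the prober saw.
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	// Kubelet HTTP probes treat any status in the 200-399 range as success.
	fmt.Println("probe status:", resp.Status)
}
```

Here the refusal is expected: the old cinder-api containers had just received SIGTERM (exitCode=143 above), and the replacement pod's sandbox is only created at 02:48:33.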
Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643836 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643880 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2r7p\" (UniqueName: \"kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.643983 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.644001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data\") pod \"aea6d65f-84ae-4ea9-8970-41d287e2f672\" (UID: \"aea6d65f-84ae-4ea9-8970-41d287e2f672\") " Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.644568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs" (OuterVolumeSpecName: "logs") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.645366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "etc-machine-id".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.649589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.650325 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p" (OuterVolumeSpecName: "kube-api-access-l2r7p") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "kube-api-access-l2r7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.651835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts" (OuterVolumeSpecName: "scripts") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.674505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.714960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data" (OuterVolumeSpecName: "config-data") pod "aea6d65f-84ae-4ea9-8970-41d287e2f672" (UID: "aea6d65f-84ae-4ea9-8970-41d287e2f672"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745881 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2r7p\" (UniqueName: \"kubernetes.io/projected/aea6d65f-84ae-4ea9-8970-41d287e2f672-kube-api-access-l2r7p\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745918 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aea6d65f-84ae-4ea9-8970-41d287e2f672-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745927 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745936 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745944 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745952 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aea6d65f-84ae-4ea9-8970-41d287e2f672-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.745960 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea6d65f-84ae-4ea9-8970-41d287e2f672-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.758274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.951767 4749 generic.go:334] "Generic (PLEG): container finished" podID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerID="d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057" exitCode=0 Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.951820 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.951840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerDied","Data":"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057"} Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.952184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"aea6d65f-84ae-4ea9-8970-41d287e2f672","Type":"ContainerDied","Data":"95d48b23ad7301b8276dd6d0c9caa3fe2c8d375084014a3998d508e58d72f8a8"} Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.952214 4749 scope.go:117] "RemoveContainer" containerID="d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.955413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a36dabb0-136d-40c7-b9b1-7174cd3ba355","Type":"ContainerStarted","Data":"04d5df8e49135e1a1762ada8a7814ec240473c2d93a50effed46fc6706aba6db"} Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.955457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a36dabb0-136d-40c7-b9b1-7174cd3ba355","Type":"ContainerStarted","Data":"20e67f111dd3d3717c7d21a5cad8d51ebc90289558b7184d423973387db5697a"} Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.983743 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.006076409 podStartE2EDuration="2.983727562s" podCreationTimestamp="2025-11-29 02:48:30 +0000 UTC" firstStartedPulling="2025-11-29 02:48:31.253859117 +0000 UTC m=+5854.426008974" lastFinishedPulling="2025-11-29 02:48:32.23151027 +0000 UTC m=+5855.403660127" observedRunningTime="2025-11-29 02:48:32.977639294 +0000 UTC m=+5856.149789151" watchObservedRunningTime="2025-11-29 02:48:32.983727562 +0000 UTC m=+5856.155877419" Nov 29 02:48:32 crc kubenswrapper[4749]: I1129 02:48:32.984031 4749 scope.go:117] "RemoveContainer" containerID="7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.000491 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.008098 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.027677 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:48:33 crc kubenswrapper[4749]: E1129 02:48:33.028135 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api-log" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.028184 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api-log" Nov 29 02:48:33 crc kubenswrapper[4749]: E1129 02:48:33.029269 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.029291 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.029466 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api-log" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.029501 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" containerName="cinder-api" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.030684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.036758 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.039597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.060364 4749 scope.go:117] "RemoveContainer" containerID="d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057" Nov 29 02:48:33 crc kubenswrapper[4749]: E1129 02:48:33.068912 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057\": container with ID starting with d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057 not found: ID does not exist" containerID="d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.068955 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057"} err="failed to get container status \"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057\": rpc error: code = NotFound desc = could not find container \"d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057\": container with ID starting with d45097cc003ba6d9620fad91f7ba2f6e7578f37a84c5ed48cf83942d17634057 not found: ID does not exist" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.068984 4749 scope.go:117] "RemoveContainer" containerID="7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c" Nov 29 02:48:33 crc kubenswrapper[4749]: E1129 02:48:33.069392 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c\": container with ID starting with 7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c not found: ID does not exist" containerID="7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.069429 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c"} err="failed to get container status \"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c\": rpc error: code = NotFound desc = could not find container \"7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c\": container with ID starting with 7ea2ce2006b69f090e44630e064b165e2f28330a5afc8c73c579f2433c9f664c not found: ID does not exist" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.087230 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea6d65f-84ae-4ea9-8970-41d287e2f672" path="/var/lib/kubelet/pods/aea6d65f-84ae-4ea9-8970-41d287e2f672/volumes" Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 
02:48:33.154553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17155680-b07e-462b-a201-4823c5613f54-logs\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.155150 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.155604 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17155680-b07e-462b-a201-4823c5613f54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.155678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cpgm\" (UniqueName: \"kubernetes.io/projected/17155680-b07e-462b-a201-4823c5613f54-kube-api-access-2cpgm\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.155776 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.156364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data-custom\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.158143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-scripts\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.253914 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.254377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.257359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.260892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.260958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data-custom\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.261075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-scripts\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.261131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17155680-b07e-462b-a201-4823c5613f54-logs\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.261189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.261370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17155680-b07e-462b-a201-4823c5613f54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.261412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cpgm\" (UniqueName: \"kubernetes.io/projected/17155680-b07e-462b-a201-4823c5613f54-kube-api-access-2cpgm\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.263719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/17155680-b07e-462b-a201-4823c5613f54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.265232 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17155680-b07e-462b-a201-4823c5613f54-logs\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.266109 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.268427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-scripts\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.272378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data-custom\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.272724 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.277099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17155680-b07e-462b-a201-4823c5613f54-config-data\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.303635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cpgm\" (UniqueName: \"kubernetes.io/projected/17155680-b07e-462b-a201-4823c5613f54-kube-api-access-2cpgm\") pod \"cinder-api-0\" (UID: \"17155680-b07e-462b-a201-4823c5613f54\") " pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.318157 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.321297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.322226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.374003 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.877848 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Nov 29 02:48:33 crc kubenswrapper[4749]: W1129 02:48:33.883340 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17155680_b07e_462b_a201_4823c5613f54.slice/crio-223703a13d04059aebce531d9b4c457ae4e073c62d21a813d8d77836f8907400 WatchSource:0}: Error finding container 223703a13d04059aebce531d9b4c457ae4e073c62d21a813d8d77836f8907400: Status 404 returned error can't find the container with id 223703a13d04059aebce531d9b4c457ae4e073c62d21a813d8d77836f8907400
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.971916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17155680-b07e-462b-a201-4823c5613f54","Type":"ContainerStarted","Data":"223703a13d04059aebce531d9b4c457ae4e073c62d21a813d8d77836f8907400"}
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.972387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.977164 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 29 02:48:33 crc kubenswrapper[4749]: I1129 02:48:33.978376 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 29 02:48:34 crc kubenswrapper[4749]: I1129 02:48:34.931392 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0"
Nov 29 02:48:35 crc kubenswrapper[4749]: I1129 02:48:34.995624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17155680-b07e-462b-a201-4823c5613f54","Type":"ContainerStarted","Data":"d51330b73a59ecfd769c2866db250dee16f8a8d0c64af958968e8d39eac42a2d"}
Nov 29 02:48:35 crc kubenswrapper[4749]: I1129 02:48:35.622929 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0"
Nov 29 02:48:36 crc kubenswrapper[4749]: I1129 02:48:36.019474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"17155680-b07e-462b-a201-4823c5613f54","Type":"ContainerStarted","Data":"58d63e36a5d62764a1d2e243651b97ba7cff95b142e0b0ccfbca4b8572e44962"}
Nov 29 02:48:36 crc kubenswrapper[4749]: I1129 02:48:36.019843 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Nov 29 02:48:38 crc kubenswrapper[4749]: I1129 02:48:38.029406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:38 crc kubenswrapper[4749]: I1129 02:48:38.056252 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.05622579 podStartE2EDuration="6.05622579s" podCreationTimestamp="2025-11-29 02:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:36.058036053 +0000 UTC m=+5859.230185920" watchObservedRunningTime="2025-11-29 02:48:38.05622579 +0000 UTC m=+5861.228375677"
Nov 29 02:48:38 crc kubenswrapper[4749]: I1129 02:48:38.112368 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:38 crc kubenswrapper[4749]: I1129 02:48:38.112755 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="cinder-scheduler" containerID="cri-o://030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb" gracePeriod=30
Nov 29 02:48:38 crc kubenswrapper[4749]: I1129 02:48:38.112931 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="probe" containerID="cri-o://2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66" gracePeriod=30
Nov 29 02:48:39 crc kubenswrapper[4749]: I1129 02:48:39.058164 4749 generic.go:334] "Generic (PLEG): container finished" podID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerID="2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66" exitCode=0
Nov 29 02:48:39 crc kubenswrapper[4749]: I1129 02:48:39.058279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerDied","Data":"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"}
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.076347 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:48:40 crc kubenswrapper[4749]: E1129 02:48:40.076653 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.156848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.567518 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.734923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nddrp\" (UniqueName: \"kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.735038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.735105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.735172 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.735273 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.735300 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts\") pod \"093d202b-1b95-4ebc-932a-ce6062c3abac\" (UID: \"093d202b-1b95-4ebc-932a-ce6062c3abac\") "
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.737312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.749520 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts" (OuterVolumeSpecName: "scripts") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.749610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.764237 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp" (OuterVolumeSpecName: "kube-api-access-nddrp") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "kube-api-access-nddrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.825625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.837896 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nddrp\" (UniqueName: \"kubernetes.io/projected/093d202b-1b95-4ebc-932a-ce6062c3abac-kube-api-access-nddrp\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.837926 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.837938 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.837950 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/093d202b-1b95-4ebc-932a-ce6062c3abac-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.837960 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.846930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data" (OuterVolumeSpecName: "config-data") pod "093d202b-1b95-4ebc-932a-ce6062c3abac" (UID: "093d202b-1b95-4ebc-932a-ce6062c3abac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.851948 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Nov 29 02:48:40 crc kubenswrapper[4749]: I1129 02:48:40.940399 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093d202b-1b95-4ebc-932a-ce6062c3abac-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.077989 4749 generic.go:334] "Generic (PLEG): container finished" podID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerID="030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb" exitCode=0
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.078061 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.087408 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerDied","Data":"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"}
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.087448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"093d202b-1b95-4ebc-932a-ce6062c3abac","Type":"ContainerDied","Data":"e20c29f5a6a6feadf0d96b26ae3b74db4250f03794063138ef1d3dd16cf73695"}
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.087464 4749 scope.go:117] "RemoveContainer" containerID="2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.130409 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.137115 4749 scope.go:117] "RemoveContainer" containerID="030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.147700 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.172646 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:41 crc kubenswrapper[4749]: E1129 02:48:41.173230 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="probe"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.173296 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="probe"
Nov 29 02:48:41 crc kubenswrapper[4749]: E1129 02:48:41.173368 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="cinder-scheduler"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.173420 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="cinder-scheduler"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.184018 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="probe"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.184138 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" containerName="cinder-scheduler"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.185039 4749 scope.go:117] "RemoveContainer" containerID="2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.185873 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.186013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: E1129 02:48:41.186343 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66\": container with ID starting with 2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66 not found: ID does not exist" containerID="2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.186384 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66"} err="failed to get container status \"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66\": rpc error: code = NotFound desc = could not find container \"2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66\": container with ID starting with 2b050f0c5fc27ac623f86e7cfe241f585cf61b711d3b6e335b9184c6717bfd66 not found: ID does not exist"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.186408 4749 scope.go:117] "RemoveContainer" containerID="030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"
Nov 29 02:48:41 crc kubenswrapper[4749]: E1129 02:48:41.186779 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb\": container with ID starting with 030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb not found: ID does not exist" containerID="030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.186794 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb"} err="failed to get container status \"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb\": rpc error: code = NotFound desc = could not find container \"030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb\": container with ID starting with 030137af7a0045d8a10c465e618556b372931350043a3a332c8ede403254a2eb not found: ID does not exist"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.188254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349003 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-scripts\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsnz\" (UniqueName: \"kubernetes.io/projected/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-kube-api-access-7gsnz\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349213 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.349427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-scripts\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.450881 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsnz\" (UniqueName: \"kubernetes.io/projected/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-kube-api-access-7gsnz\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.451167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.456655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.457301 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.462329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-scripts\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.463121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.482481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsnz\" (UniqueName: \"kubernetes.io/projected/b12b8c41-e630-4f71-bc3f-24fdd2b25a5c-kube-api-access-7gsnz\") pod \"cinder-scheduler-0\" (UID: \"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c\") " pod="openstack/cinder-scheduler-0"
Nov 29 02:48:41 crc kubenswrapper[4749]: I1129 02:48:41.525828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:42 crc kubenswrapper[4749]: I1129 02:48:42.884636 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 29 02:48:42 crc kubenswrapper[4749]: W1129 02:48:42.918465 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12b8c41_e630_4f71_bc3f_24fdd2b25a5c.slice/crio-aa6483ba3fcf1cd3c48f2a0574bee67fc5a43d21e3a83007e77c1ad6713ea914 WatchSource:0}: Error finding container aa6483ba3fcf1cd3c48f2a0574bee67fc5a43d21e3a83007e77c1ad6713ea914: Status 404 returned error can't find the container with id aa6483ba3fcf1cd3c48f2a0574bee67fc5a43d21e3a83007e77c1ad6713ea914
Nov 29 02:48:43 crc kubenswrapper[4749]: I1129 02:48:43.110441 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093d202b-1b95-4ebc-932a-ce6062c3abac" path="/var/lib/kubelet/pods/093d202b-1b95-4ebc-932a-ce6062c3abac/volumes"
Nov 29 02:48:43 crc kubenswrapper[4749]: I1129 02:48:43.867264 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c","Type":"ContainerStarted","Data":"3559e465615f6fdaed8f96ef12987141bcd8ad03b0cf88d0be80774ee54fdab1"}
Nov 29 02:48:43 crc kubenswrapper[4749]: I1129 02:48:43.867529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c","Type":"ContainerStarted","Data":"aa6483ba3fcf1cd3c48f2a0574bee67fc5a43d21e3a83007e77c1ad6713ea914"}
Nov 29 02:48:44 crc kubenswrapper[4749]: I1129 02:48:44.880808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b12b8c41-e630-4f71-bc3f-24fdd2b25a5c","Type":"ContainerStarted","Data":"17c7b691a702015021ab8927996eac92916abe55ce1a7dfd84e74cf67c6edb47"}
Nov 29 02:48:44 crc kubenswrapper[4749]: I1129 02:48:44.908317 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.908296861 podStartE2EDuration="3.908296861s" podCreationTimestamp="2025-11-29 02:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:48:44.902716975 +0000 UTC m=+5868.074866872" watchObservedRunningTime="2025-11-29 02:48:44.908296861 +0000 UTC m=+5868.080446728"
Nov 29 02:48:45 crc kubenswrapper[4749]: I1129 02:48:45.290513 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Nov 29 02:48:46 crc kubenswrapper[4749]: I1129 02:48:46.526531 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:51 crc kubenswrapper[4749]: I1129 02:48:51.761483 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Nov 29 02:48:54 crc kubenswrapper[4749]: I1129 02:48:54.075553 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:48:54 crc kubenswrapper[4749]: E1129 02:48:54.076414 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:49:05 crc kubenswrapper[4749]: I1129 02:49:05.074783 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:49:05 crc kubenswrapper[4749]: E1129 02:49:05.075389 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:49:19 crc kubenswrapper[4749]: I1129 02:49:19.075472 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:49:19 crc kubenswrapper[4749]: E1129 02:49:19.076641 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:49:30 crc kubenswrapper[4749]: I1129 02:49:30.075494 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:49:30 crc kubenswrapper[4749]: E1129 02:49:30.076127 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.526901 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.530896 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.545137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.677851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.677932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjxg\" (UniqueName: \"kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.678097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.780511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.780670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btjxg\" (UniqueName: \"kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.780845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.781318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.781402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.799993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjxg\" (UniqueName: \"kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg\") pod \"redhat-operators-6zvqn\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") " pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:31 crc kubenswrapper[4749]: I1129 02:49:31.877060 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:32 crc kubenswrapper[4749]: I1129 02:49:32.352377 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:32 crc kubenswrapper[4749]: I1129 02:49:32.488269 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerStarted","Data":"6741217f9248817b035f2d147bf49a1853f49512784609f1d4663a3af63f883b"}
Nov 29 02:49:33 crc kubenswrapper[4749]: I1129 02:49:33.506688 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerID="4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d" exitCode=0
Nov 29 02:49:33 crc kubenswrapper[4749]: I1129 02:49:33.506825 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerDied","Data":"4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d"}
Nov 29 02:49:34 crc kubenswrapper[4749]: I1129 02:49:34.524245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerStarted","Data":"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"}
Nov 29 02:49:36 crc kubenswrapper[4749]: I1129 02:49:36.555963 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerID="5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed" exitCode=0
Nov 29 02:49:36 crc kubenswrapper[4749]: I1129 02:49:36.556089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerDied","Data":"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"}
Nov 29 02:49:37 crc kubenswrapper[4749]: I1129 02:49:37.569967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerStarted","Data":"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"}
Nov 29 02:49:37 crc kubenswrapper[4749]: I1129 02:49:37.602767 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6zvqn" podStartSLOduration=3.10976457 podStartE2EDuration="6.602737077s" podCreationTimestamp="2025-11-29 02:49:31 +0000 UTC" firstStartedPulling="2025-11-29 02:49:33.50990734 +0000 UTC m=+5916.682057237" lastFinishedPulling="2025-11-29 02:49:37.002879847 +0000 UTC m=+5920.175029744" observedRunningTime="2025-11-29 02:49:37.59627949 +0000 UTC m=+5920.768429407" watchObservedRunningTime="2025-11-29 02:49:37.602737077 +0000 UTC m=+5920.774886974"
Nov 29 02:49:41 crc kubenswrapper[4749]: I1129 02:49:41.878118 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:41 crc kubenswrapper[4749]: I1129 02:49:41.878985 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:42 crc kubenswrapper[4749]: I1129 02:49:42.926683 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6zvqn" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="registry-server" probeResult="failure" output=<
Nov 29 02:49:42 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s
Nov 29 02:49:42 crc kubenswrapper[4749]: >
Nov 29 02:49:45 crc kubenswrapper[4749]: I1129 02:49:45.076080 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:49:45 crc kubenswrapper[4749]: E1129 02:49:45.076558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:49:51 crc kubenswrapper[4749]: I1129 02:49:51.963175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:52 crc kubenswrapper[4749]: I1129 02:49:52.043506 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:52 crc kubenswrapper[4749]: I1129 02:49:52.220149 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:53 crc kubenswrapper[4749]: I1129 02:49:53.755385 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6zvqn" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="registry-server" containerID="cri-o://89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f" gracePeriod=2
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.315020 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.485610 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btjxg\" (UniqueName: \"kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg\") pod \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") "
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.485806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content\") pod \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") "
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.485935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities\") pod \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\" (UID: \"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc\") "
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.487752 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities" (OuterVolumeSpecName: "utilities") pod "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" (UID: "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.498578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg" (OuterVolumeSpecName: "kube-api-access-btjxg") pod "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" (UID: "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc"). InnerVolumeSpecName "kube-api-access-btjxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.588405 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btjxg\" (UniqueName: \"kubernetes.io/projected/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-kube-api-access-btjxg\") on node \"crc\" DevicePath \"\""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.588441 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.669286 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" (UID: "f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.689690 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.766051 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerID="89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f" exitCode=0
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.766093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerDied","Data":"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"}
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.766118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zvqn" event={"ID":"f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc","Type":"ContainerDied","Data":"6741217f9248817b035f2d147bf49a1853f49512784609f1d4663a3af63f883b"}
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.766133 4749 scope.go:117] "RemoveContainer" containerID="89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.766263 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zvqn"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.807973 4749 scope.go:117] "RemoveContainer" containerID="5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.810264 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.822610 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6zvqn"]
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.837119 4749 scope.go:117] "RemoveContainer" containerID="4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.869254 4749 scope.go:117] "RemoveContainer" containerID="89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"
Nov 29 02:49:54 crc kubenswrapper[4749]: E1129 02:49:54.869719 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f\": container with ID starting with 89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f not found: ID does not exist" containerID="89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.869755 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f"} err="failed to get container status \"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f\": rpc error: code = NotFound desc = could not find container \"89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f\": container with ID starting with 89157846db333c000d796d0001fd59614c718659f500b14e3c5fb4920ac3df3f not found: ID does not exist"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.869782 4749 scope.go:117] "RemoveContainer" containerID="5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"
Nov 29 02:49:54 crc kubenswrapper[4749]: E1129 02:49:54.870117 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed\": container with ID starting with 5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed not found: ID does not exist" containerID="5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.870140 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed"} err="failed to get container status \"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed\": rpc error: code = NotFound desc = could not find container \"5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed\": container with ID starting with 5a0702d583e06e135164c7e3cb01f6a6b2c8384e9641be49071d6399a5efc7ed not found: ID does not exist"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.870153 4749 scope.go:117] "RemoveContainer" containerID="4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d"
Nov 29 02:49:54 crc kubenswrapper[4749]: E1129 02:49:54.870478 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d\": container with ID starting with 4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d not found: ID does not exist" containerID="4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d"
Nov 29 02:49:54 crc kubenswrapper[4749]: I1129 02:49:54.870506 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d"} err="failed to get container status \"4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d\": rpc error: code = NotFound desc = could not find container \"4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d\": container with ID starting with 4ee3a618d4146948bc920e336de28a69e031d24d9127ede7410613333fe3296d not found: ID does not exist"
Nov 29 02:49:55 crc kubenswrapper[4749]: I1129 02:49:55.095433 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" path="/var/lib/kubelet/pods/f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc/volumes"
Nov 29 02:49:57 crc kubenswrapper[4749]: I1129 02:49:57.089117 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:49:57 crc kubenswrapper[4749]: E1129 02:49:57.091544 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:50:11 crc kubenswrapper[4749]: I1129 02:50:11.075434 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:50:11 crc kubenswrapper[4749]: E1129 02:50:11.076872 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:50:26 crc kubenswrapper[4749]: I1129 02:50:26.075282 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd"
Nov 29 02:50:26 crc kubenswrapper[4749]: E1129 02:50:26.076846 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 02:50:32 crc kubenswrapper[4749]: I1129 02:50:32.085658 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7lw9x"]
Nov 29 02:50:32 crc kubenswrapper[4749]: I1129 02:50:32.095778 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7lw9x"]
Nov 29 02:50:32 crc kubenswrapper[4749]: I1129 02:50:32.129990 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8dd6-account-create-update-xf568"]
Nov 29 02:50:32 crc kubenswrapper[4749]: I1129 02:50:32.146578 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8dd6-account-create-update-xf568"]
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.095117 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7adc6b-04e8-47a2-a570-c8e37a608860" path="/var/lib/kubelet/pods/1e7adc6b-04e8-47a2-a570-c8e37a608860/volumes"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.096636 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db62156-8d63-4864-a706-67ebdab6170c" path="/var/lib/kubelet/pods/8db62156-8d63-4864-a706-67ebdab6170c/volumes"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.744686 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-z2brk"]
Nov 29 02:50:33 crc kubenswrapper[4749]: E1129 02:50:33.745220 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="extract-content"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.745239 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="extract-content"
Nov 29 02:50:33 crc kubenswrapper[4749]: E1129 02:50:33.745262 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="extract-utilities"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.745269 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="extract-utilities"
Nov 29 02:50:33 crc kubenswrapper[4749]: E1129 02:50:33.745302 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="registry-server"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.745311 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="registry-server"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.745546 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0184f4b-c18b-4210-ab67-8d0cbb9a4ebc" containerName="registry-server"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.746483 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.749300 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-x5pfq"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.763636 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.767082 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-7bnqw"]
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.769326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.795431 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z2brk"]
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.835261 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7bnqw"]
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.895838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-scripts\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.895934 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.895964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvsxn\" (UniqueName: \"kubernetes.io/projected/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-kube-api-access-bvsxn\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-etc-ovs\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-lib\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-scripts\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896228 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-log\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-log-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vv9\" (UniqueName: \"kubernetes.io/projected/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-kube-api-access-46vv9\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.896683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-run\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-etc-ovs\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-lib\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-scripts\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-log\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-log-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vv9\" (UniqueName: \"kubernetes.io/projected/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-kube-api-access-46vv9\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-run\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-scripts\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998954 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.998976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvsxn\" (UniqueName: \"kubernetes.io/projected/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-kube-api-access-bvsxn\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-lib\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw"
Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999443 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk"
Nov 29 02:50:33 crc
kubenswrapper[4749]: I1129 02:50:33.999499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-log-ovn\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk" Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999503 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-etc-ovs\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-run\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-var-log\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:33 crc kubenswrapper[4749]: I1129 02:50:33.999950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-var-run\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.001502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-scripts\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.001520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-scripts\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.017321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvsxn\" (UniqueName: \"kubernetes.io/projected/2d20ab0c-d80d-4dd5-98d2-8f09ec505527-kube-api-access-bvsxn\") pod \"ovn-controller-ovs-7bnqw\" (UID: \"2d20ab0c-d80d-4dd5-98d2-8f09ec505527\") " pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.017446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vv9\" (UniqueName: \"kubernetes.io/projected/0cffba40-2751-4ab1-a9b5-9c8c041a83f8-kube-api-access-46vv9\") pod \"ovn-controller-z2brk\" (UID: \"0cffba40-2751-4ab1-a9b5-9c8c041a83f8\") " pod="openstack/ovn-controller-z2brk" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.090809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z2brk" Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.139331 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-7bnqw"
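Every volume for the two new OVN pods passes through the same three stages visible above: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), operationExecutor.MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). A hypothetical consistency check, written against the exact quoting style of these journal lines, that flags any mount started but never reported successful in the captured window (unfinished_mounts is an illustrative helper, not kubelet code):

    import re
    from collections import Counter

    # The journal text carries literal \" escapes, so the patterns match
    # backslash-quote around the UniqueName. Keying on the UniqueName
    # avoids collisions: both pods above have a volume named "scripts".
    STARTED = re.compile(r'MountVolume started for volume .*?\(UniqueName: \\"([^"\\]+)\\"\)')
    SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume .*?\(UniqueName: \\"([^"\\]+)\\"\)')

    def unfinished_mounts(journal_text):
        pending = Counter(STARTED.findall(journal_text))
        pending.subtract(SUCCEEDED.findall(journal_text))
        return {vol: n for vol, n in pending.items() if n > 0}

Fed this window, the check should come back empty: every mount started for ovn-controller-z2brk and ovn-controller-ovs-7bnqw reports SetUp success within a few milliseconds (the projected kube-api-access volumes trail slightly, at 02:50:34.017).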
Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.567380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z2brk"] Nov 29 02:50:34 crc kubenswrapper[4749]: W1129 02:50:34.575058 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cffba40_2751_4ab1_a9b5_9c8c041a83f8.slice/crio-47183fec82d5e4eed40a142d3ecbba66f7fa80a3e8d6dbbc7448c1cc241fa638 WatchSource:0}: Error finding container 47183fec82d5e4eed40a142d3ecbba66f7fa80a3e8d6dbbc7448c1cc241fa638: Status 404 returned error can't find the container with id 47183fec82d5e4eed40a142d3ecbba66f7fa80a3e8d6dbbc7448c1cc241fa638 Nov 29 02:50:34 crc kubenswrapper[4749]: W1129 02:50:34.987009 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d20ab0c_d80d_4dd5_98d2_8f09ec505527.slice/crio-618aacd97d7a31a4c4d7b3a12b10836eb2b881d2894582e91ba9b8b5110ba45c WatchSource:0}: Error finding container 618aacd97d7a31a4c4d7b3a12b10836eb2b881d2894582e91ba9b8b5110ba45c: Status 404 returned error can't find the container with id 618aacd97d7a31a4c4d7b3a12b10836eb2b881d2894582e91ba9b8b5110ba45c Nov 29 02:50:34 crc kubenswrapper[4749]: I1129 02:50:34.991942 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-7bnqw"] Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.254786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk" event={"ID":"0cffba40-2751-4ab1-a9b5-9c8c041a83f8","Type":"ContainerStarted","Data":"f2cc4cbebcdabf6c5da372c834abd2b2155692a256e91b33c9f7cca71020823f"} Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.254855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk" event={"ID":"0cffba40-2751-4ab1-a9b5-9c8c041a83f8","Type":"ContainerStarted","Data":"47183fec82d5e4eed40a142d3ecbba66f7fa80a3e8d6dbbc7448c1cc241fa638"} Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.254926 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-z2brk" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.257608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7bnqw" event={"ID":"2d20ab0c-d80d-4dd5-98d2-8f09ec505527","Type":"ContainerStarted","Data":"618aacd97d7a31a4c4d7b3a12b10836eb2b881d2894582e91ba9b8b5110ba45c"} Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.274156 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-z2brk" podStartSLOduration=2.27413833 podStartE2EDuration="2.27413833s" podCreationTimestamp="2025-11-29 02:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:50:35.273837562 +0000 UTC m=+5978.445987459" watchObservedRunningTime="2025-11-29 02:50:35.27413833 +0000 UTC m=+5978.446288207" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.369325 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6fbp7"] Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.370791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6fbp7"
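The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and the zero-valued pull timestamps (0001-01-01 00:00:00) indicate no image pull was needed. A quick check of the ovn-controller-z2brk figures (Python's datetime keeps microseconds, so the nanosecond tail of the journal value is truncated):

    from datetime import datetime, timezone

    # Figures copied from the entry above for openstack/ovn-controller-z2brk.
    created  = datetime(2025, 11, 29, 2, 50, 33, tzinfo=timezone.utc)          # podCreationTimestamp
    observed = datetime(2025, 11, 29, 2, 50, 35, 274138, tzinfo=timezone.utc)  # watchObservedRunningTime, ns truncated
    print((observed - created).total_seconds())
    # -> 2.274138, matching podStartSLOduration=2.27413833
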
Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.379998 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.393565 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6fbp7"] Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.436755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnf8\" (UniqueName: \"kubernetes.io/projected/4af332b0-069f-4cf1-976a-076483cfe432-kube-api-access-hpnf8\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.436871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovn-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.436919 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af332b0-069f-4cf1-976a-076483cfe432-config\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.436940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovs-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538311 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovn-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af332b0-069f-4cf1-976a-076483cfe432-config\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538398 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovs-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnf8\" (UniqueName: \"kubernetes.io/projected/4af332b0-069f-4cf1-976a-076483cfe432-kube-api-access-hpnf8\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") "
pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovn-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.538797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4af332b0-069f-4cf1-976a-076483cfe432-ovs-rundir\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.539336 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af332b0-069f-4cf1-976a-076483cfe432-config\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.565580 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnf8\" (UniqueName: \"kubernetes.io/projected/4af332b0-069f-4cf1-976a-076483cfe432-kube-api-access-hpnf8\") pod \"ovn-controller-metrics-6fbp7\" (UID: \"4af332b0-069f-4cf1-976a-076483cfe432\") " pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.703078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6fbp7" Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.992783 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-rr527"] Nov 29 02:50:35 crc kubenswrapper[4749]: I1129 02:50:35.994759 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.003587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rr527"] Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.048500 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.048698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrtxh\" (UniqueName: \"kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.150718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.151157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrtxh\" (UniqueName: \"kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.151755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.176055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6fbp7"] Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.181392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrtxh\" (UniqueName: \"kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh\") pod \"octavia-db-create-rr527\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: W1129 02:50:36.193526 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4af332b0_069f_4cf1_976a_076483cfe432.slice/crio-c4be52b1f4d307763d71d325d294c5d5d5c46651114c16bb89a83b41c3183cf3 WatchSource:0}: Error finding container c4be52b1f4d307763d71d325d294c5d5d5c46651114c16bb89a83b41c3183cf3: Status 404 returned error can't find the container with id c4be52b1f4d307763d71d325d294c5d5d5c46651114c16bb89a83b41c3183cf3 Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.276235 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d20ab0c-d80d-4dd5-98d2-8f09ec505527" containerID="c215a0b37b17471604b5b3f6a940896763d8c60cd90beafb8a59e8b58c0e87bc" exitCode=0 Nov 29 02:50:36 crc 
kubenswrapper[4749]: I1129 02:50:36.278028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7bnqw" event={"ID":"2d20ab0c-d80d-4dd5-98d2-8f09ec505527","Type":"ContainerDied","Data":"c215a0b37b17471604b5b3f6a940896763d8c60cd90beafb8a59e8b58c0e87bc"} Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.287151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fbp7" event={"ID":"4af332b0-069f-4cf1-976a-076483cfe432","Type":"ContainerStarted","Data":"c4be52b1f4d307763d71d325d294c5d5d5c46651114c16bb89a83b41c3183cf3"} Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.310354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rr527" Nov 29 02:50:36 crc kubenswrapper[4749]: I1129 02:50:36.820918 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rr527"] Nov 29 02:50:36 crc kubenswrapper[4749]: W1129 02:50:36.826439 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf870051_768d_41f5_93bb_0f977eb2ae14.slice/crio-a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb WatchSource:0}: Error finding container a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb: Status 404 returned error can't find the container with id a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.279042 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-cb81-account-create-update-mvmq8"] Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.281306 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.283187 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.304353 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cb81-account-create-update-mvmq8"] Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.307657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7bnqw" event={"ID":"2d20ab0c-d80d-4dd5-98d2-8f09ec505527","Type":"ContainerStarted","Data":"436ae015ac8ab28c2add8e4fc72a88afd6202f6f7c9bacf23421401b21824eca"} Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.307703 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.307716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-7bnqw" event={"ID":"2d20ab0c-d80d-4dd5-98d2-8f09ec505527","Type":"ContainerStarted","Data":"7f4c84632c7d24d55b8396bcd8ac8be9505b907179a394c78b946529e3f477a7"} Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.307735 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.310491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6fbp7" event={"ID":"4af332b0-069f-4cf1-976a-076483cfe432","Type":"ContainerStarted","Data":"31ca324775018c01b58cff3c16f558688436632720323c94f2d9b3b9dca23f66"}
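The W-level manager.go:1169 "Failed to process watch event ... Status 404" entries (02:50:34.575058 and 02:50:34.987009 earlier, 02:50:36.826439 above) appear when cAdvisor reacts to a freshly created crio-<id> cgroup before the runtime can resolve that container ID; in this capture each such ID shows up moments later in a ContainerStarted PLEG event, the usual sign the warning was only a startup race. A hypothetical cross-check (unexplained_watch_errors is an illustrative helper):

    import re

    WATCH_404 = re.compile(r"Error finding container ([0-9a-f]{64})")
    PLEG_STARTED = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

    def unexplained_watch_errors(journal_text):
        # IDs from 404 watch-event warnings that never produced a
        # ContainerStarted event; an empty result means every warning
        # was just the usual cgroup-before-runtime race.
        return set(WATCH_404.findall(journal_text)) - set(PLEG_STARTED.findall(journal_text))

Run over this window the set is empty: 47183fec..., 618aacd9..., c4be52b1..., and a70176eb... all reach ContainerStarted within a second or two of the warning.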
Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.313574 4749 generic.go:334] "Generic (PLEG): container finished" podID="bf870051-768d-41f5-93bb-0f977eb2ae14" containerID="9f3bd2523993ca220cd187bec20bbccb45f18e29e61d18375d06104e85698abc" exitCode=0 Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.313615 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rr527" event={"ID":"bf870051-768d-41f5-93bb-0f977eb2ae14","Type":"ContainerDied","Data":"9f3bd2523993ca220cd187bec20bbccb45f18e29e61d18375d06104e85698abc"} Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.313639 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rr527" event={"ID":"bf870051-768d-41f5-93bb-0f977eb2ae14","Type":"ContainerStarted","Data":"a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb"} Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.340155 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-7bnqw" podStartSLOduration=4.340131885 podStartE2EDuration="4.340131885s" podCreationTimestamp="2025-11-29 02:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:50:37.329114207 +0000 UTC m=+5980.501264074" watchObservedRunningTime="2025-11-29 02:50:37.340131885 +0000 UTC m=+5980.512281742" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.384344 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6fbp7" podStartSLOduration=2.384327609 podStartE2EDuration="2.384327609s" podCreationTimestamp="2025-11-29 02:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:50:37.363775839 +0000 UTC m=+5980.535925706" watchObservedRunningTime="2025-11-29 02:50:37.384327609 +0000 UTC m=+5980.556477466" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.384723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts\") pod \"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.384885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r557j\" (UniqueName: \"kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j\") pod \"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.486533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r557j\" (UniqueName: \"kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j\") pod \"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.486652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts\") pod 
\"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.487424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts\") pod \"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.527930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r557j\" (UniqueName: \"kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j\") pod \"octavia-cb81-account-create-update-mvmq8\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:37 crc kubenswrapper[4749]: I1129 02:50:37.605968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.075893 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:50:38 crc kubenswrapper[4749]: E1129 02:50:38.076788 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:50:38 crc kubenswrapper[4749]: W1129 02:50:38.100899 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ddb009_1193_414f_9d3d_2e43e6eef806.slice/crio-e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb WatchSource:0}: Error finding container e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb: Status 404 returned error can't find the container with id e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.105380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-cb81-account-create-update-mvmq8"] Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.329787 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cb81-account-create-update-mvmq8" event={"ID":"04ddb009-1193-414f-9d3d-2e43e6eef806","Type":"ContainerStarted","Data":"e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb"} Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.782885 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-rr527" Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.921727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrtxh\" (UniqueName: \"kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh\") pod \"bf870051-768d-41f5-93bb-0f977eb2ae14\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.922029 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts\") pod \"bf870051-768d-41f5-93bb-0f977eb2ae14\" (UID: \"bf870051-768d-41f5-93bb-0f977eb2ae14\") " Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.923267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf870051-768d-41f5-93bb-0f977eb2ae14" (UID: "bf870051-768d-41f5-93bb-0f977eb2ae14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.925299 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf870051-768d-41f5-93bb-0f977eb2ae14-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:38 crc kubenswrapper[4749]: I1129 02:50:38.931620 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh" (OuterVolumeSpecName: "kube-api-access-hrtxh") pod "bf870051-768d-41f5-93bb-0f977eb2ae14" (UID: "bf870051-768d-41f5-93bb-0f977eb2ae14"). InnerVolumeSpecName "kube-api-access-hrtxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.027665 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrtxh\" (UniqueName: \"kubernetes.io/projected/bf870051-768d-41f5-93bb-0f977eb2ae14-kube-api-access-hrtxh\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.059968 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c88sj"] Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.099387 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c88sj"] Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.338063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rr527" event={"ID":"bf870051-768d-41f5-93bb-0f977eb2ae14","Type":"ContainerDied","Data":"a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb"} Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.338108 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70176eb4424312241d3ac15916bbd9f57843d75e97e4eb5c30685a86832aacb" Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.338146 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-rr527" Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.339848 4749 generic.go:334] "Generic (PLEG): container finished" podID="04ddb009-1193-414f-9d3d-2e43e6eef806" containerID="bc91c4659f52834f20f498c7df37cd086edb1cb966ea6bd1d722b8843c472a18" exitCode=0 Nov 29 02:50:39 crc kubenswrapper[4749]: I1129 02:50:39.339953 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cb81-account-create-update-mvmq8" event={"ID":"04ddb009-1193-414f-9d3d-2e43e6eef806","Type":"ContainerDied","Data":"bc91c4659f52834f20f498c7df37cd086edb1cb966ea6bd1d722b8843c472a18"} Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.726727 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.866784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts\") pod \"04ddb009-1193-414f-9d3d-2e43e6eef806\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.867456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r557j\" (UniqueName: \"kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j\") pod \"04ddb009-1193-414f-9d3d-2e43e6eef806\" (UID: \"04ddb009-1193-414f-9d3d-2e43e6eef806\") " Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.867749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04ddb009-1193-414f-9d3d-2e43e6eef806" (UID: "04ddb009-1193-414f-9d3d-2e43e6eef806"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.868176 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04ddb009-1193-414f-9d3d-2e43e6eef806-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.875347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j" (OuterVolumeSpecName: "kube-api-access-r557j") pod "04ddb009-1193-414f-9d3d-2e43e6eef806" (UID: "04ddb009-1193-414f-9d3d-2e43e6eef806"). InnerVolumeSpecName "kube-api-access-r557j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:50:40 crc kubenswrapper[4749]: I1129 02:50:40.970327 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r557j\" (UniqueName: \"kubernetes.io/projected/04ddb009-1193-414f-9d3d-2e43e6eef806-kube-api-access-r557j\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:41 crc kubenswrapper[4749]: I1129 02:50:41.101435 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a77337-373b-4b13-93d4-8f338b35f511" path="/var/lib/kubelet/pods/31a77337-373b-4b13-93d4-8f338b35f511/volumes" Nov 29 02:50:41 crc kubenswrapper[4749]: I1129 02:50:41.367624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-cb81-account-create-update-mvmq8" event={"ID":"04ddb009-1193-414f-9d3d-2e43e6eef806","Type":"ContainerDied","Data":"e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb"} Nov 29 02:50:41 crc kubenswrapper[4749]: I1129 02:50:41.367861 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a569e54169c9d20cd8d7362463fdd379589f950999d7059e76619b8e0a97cb" Nov 29 02:50:41 crc kubenswrapper[4749]: I1129 02:50:41.368046 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-cb81-account-create-update-mvmq8" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.988288 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-tpwtc"] Nov 29 02:50:42 crc kubenswrapper[4749]: E1129 02:50:42.989950 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ddb009-1193-414f-9d3d-2e43e6eef806" containerName="mariadb-account-create-update" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.990002 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ddb009-1193-414f-9d3d-2e43e6eef806" containerName="mariadb-account-create-update" Nov 29 02:50:42 crc kubenswrapper[4749]: E1129 02:50:42.990036 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf870051-768d-41f5-93bb-0f977eb2ae14" containerName="mariadb-database-create" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.990050 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf870051-768d-41f5-93bb-0f977eb2ae14" containerName="mariadb-database-create" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.990452 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ddb009-1193-414f-9d3d-2e43e6eef806" containerName="mariadb-account-create-update" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.990504 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf870051-768d-41f5-93bb-0f977eb2ae14" containerName="mariadb-database-create" Nov 29 02:50:42 crc kubenswrapper[4749]: I1129 02:50:42.991683 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-tpwtc"
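The newly admitted octavia-persistence-db-create pod will follow the same short-lived job pattern as the two octavia create jobs just torn down: ADD, volume mounts, ContainerStarted, ContainerDied with exitCode=0, unmount, "Volume detached", and finally the cpu_manager/memory_manager RemoveStaleState entries above, which appear to be routine cleanup of per-container resource-manager state despite the E-level severity on the cpu_manager lines. The PLEG events carry enough to reconstruct each container's lifetime from the journal; a hypothetical reducer (container_events is an illustrative helper):

    import json
    import re
    from collections import defaultdict

    # Matches: "SyncLoop (PLEG): event for pod" pod="ns/name" event={...}
    # The event payload in these lines is valid JSON with no nested braces.
    PLEG = re.compile(r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{[^}]*\})')

    def container_events(journal_text):
        # container ID -> {"pod": ..., "ContainerStarted": True, "ContainerDied": True}
        out = defaultdict(dict)
        for pod, raw in PLEG.findall(journal_text):
            ev = json.loads(raw)
            out[ev["Data"]]["pod"] = pod
            out[ev["Data"]][ev["Type"]] = True
        return out

For the create jobs in this window, every started container also accumulates a ContainerDied event; the per-container exit codes arrive separately via the generic.go:334 "container finished" entries.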
Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.000820 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-tpwtc"] Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.118633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.119119 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgpm\" (UniqueName: \"kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.221463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgpm\" (UniqueName: \"kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.221614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.223008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.245378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgpm\" (UniqueName: \"kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm\") pod \"octavia-persistence-db-create-tpwtc\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.310827 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.681728 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-53ad-account-create-update-s9jwg"] Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.684344 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.693608 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.701469 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-53ad-account-create-update-s9jwg"] Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.716050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-tpwtc"] Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.847205 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.847447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p7jk\" (UniqueName: \"kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.948779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.948886 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p7jk\" (UniqueName: \"kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.950381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:43 crc kubenswrapper[4749]: I1129 02:50:43.967384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p7jk\" (UniqueName: \"kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk\") pod \"octavia-53ad-account-create-update-s9jwg\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:44 crc kubenswrapper[4749]: I1129 02:50:44.111881 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:44 crc kubenswrapper[4749]: E1129 02:50:44.147419 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddb95d3b_fab4_4a91_936f_98bfc00dc782.slice/crio-conmon-7e66cacb1b10ff2c01335394b929b95d45b30bbbe57583c1207b7f987d3f79eb.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:50:44 crc kubenswrapper[4749]: I1129 02:50:44.404922 4749 generic.go:334] "Generic (PLEG): container finished" podID="ddb95d3b-fab4-4a91-936f-98bfc00dc782" containerID="7e66cacb1b10ff2c01335394b929b95d45b30bbbe57583c1207b7f987d3f79eb" exitCode=0 Nov 29 02:50:44 crc kubenswrapper[4749]: I1129 02:50:44.404996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-tpwtc" event={"ID":"ddb95d3b-fab4-4a91-936f-98bfc00dc782","Type":"ContainerDied","Data":"7e66cacb1b10ff2c01335394b929b95d45b30bbbe57583c1207b7f987d3f79eb"} Nov 29 02:50:44 crc kubenswrapper[4749]: I1129 02:50:44.405194 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-tpwtc" event={"ID":"ddb95d3b-fab4-4a91-936f-98bfc00dc782","Type":"ContainerStarted","Data":"1193237cd88378ab14c5d883711eb0ff4b4a06b8190c82989049309e193964ed"} Nov 29 02:50:44 crc kubenswrapper[4749]: I1129 02:50:44.599110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-53ad-account-create-update-s9jwg"] Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.419781 4749 generic.go:334] "Generic (PLEG): container finished" podID="6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" containerID="7d6c57c39853bb11cf1146ecb503929fbae62b7194c08e1e512055fc5f72f227" exitCode=0 Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.419907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-53ad-account-create-update-s9jwg" event={"ID":"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d","Type":"ContainerDied","Data":"7d6c57c39853bb11cf1146ecb503929fbae62b7194c08e1e512055fc5f72f227"} Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.421347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-53ad-account-create-update-s9jwg" event={"ID":"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d","Type":"ContainerStarted","Data":"20a4abcd9044e3bebff8c13c5c1166b3b1d9d1bb359bbb363eff28c0b7124c83"} Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.856182 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.999423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsgpm\" (UniqueName: \"kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm\") pod \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " Nov 29 02:50:45 crc kubenswrapper[4749]: I1129 02:50:45.999773 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts\") pod \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\" (UID: \"ddb95d3b-fab4-4a91-936f-98bfc00dc782\") " Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.000487 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddb95d3b-fab4-4a91-936f-98bfc00dc782" (UID: "ddb95d3b-fab4-4a91-936f-98bfc00dc782"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.005724 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm" (OuterVolumeSpecName: "kube-api-access-hsgpm") pod "ddb95d3b-fab4-4a91-936f-98bfc00dc782" (UID: "ddb95d3b-fab4-4a91-936f-98bfc00dc782"). InnerVolumeSpecName "kube-api-access-hsgpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.102702 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsgpm\" (UniqueName: \"kubernetes.io/projected/ddb95d3b-fab4-4a91-936f-98bfc00dc782-kube-api-access-hsgpm\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.102740 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddb95d3b-fab4-4a91-936f-98bfc00dc782-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.440402 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-tpwtc" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.440412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-tpwtc" event={"ID":"ddb95d3b-fab4-4a91-936f-98bfc00dc782","Type":"ContainerDied","Data":"1193237cd88378ab14c5d883711eb0ff4b4a06b8190c82989049309e193964ed"} Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.440490 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1193237cd88378ab14c5d883711eb0ff4b4a06b8190c82989049309e193964ed" Nov 29 02:50:46 crc kubenswrapper[4749]: I1129 02:50:46.871719 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.022756 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p7jk\" (UniqueName: \"kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk\") pod \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.022806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts\") pod \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\" (UID: \"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d\") " Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.023349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" (UID: "6eb8a55c-d7cf-4875-9d9e-7760ca068b8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.034743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk" (OuterVolumeSpecName: "kube-api-access-7p7jk") pod "6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" (UID: "6eb8a55c-d7cf-4875-9d9e-7760ca068b8d"). InnerVolumeSpecName "kube-api-access-7p7jk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.125852 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p7jk\" (UniqueName: \"kubernetes.io/projected/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-kube-api-access-7p7jk\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.125906 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.458689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-53ad-account-create-update-s9jwg" event={"ID":"6eb8a55c-d7cf-4875-9d9e-7760ca068b8d","Type":"ContainerDied","Data":"20a4abcd9044e3bebff8c13c5c1166b3b1d9d1bb359bbb363eff28c0b7124c83"} Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.458744 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a4abcd9044e3bebff8c13c5c1166b3b1d9d1bb359bbb363eff28c0b7124c83" Nov 29 02:50:47 crc kubenswrapper[4749]: I1129 02:50:47.458760 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-53ad-account-create-update-s9jwg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.148023 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-549d54fc68-lh9xg"] Nov 29 02:50:49 crc kubenswrapper[4749]: E1129 02:50:49.148790 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb95d3b-fab4-4a91-936f-98bfc00dc782" containerName="mariadb-database-create" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.148803 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb95d3b-fab4-4a91-936f-98bfc00dc782" containerName="mariadb-database-create" Nov 29 02:50:49 crc kubenswrapper[4749]: E1129 02:50:49.148824 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" containerName="mariadb-account-create-update" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.148830 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" containerName="mariadb-account-create-update" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.149014 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb95d3b-fab4-4a91-936f-98bfc00dc782" containerName="mariadb-database-create" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.149040 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" containerName="mariadb-account-create-update" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.150384 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.152005 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-bzt94" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.153195 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.153497 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.163233 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-549d54fc68-lh9xg"] Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.278208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-config-data\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.278364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-combined-ca-bundle\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.278450 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-octavia-run\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " 
pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.281153 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-scripts\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.281223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-config-data-merged\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.384266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-scripts\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.384323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-config-data-merged\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.384380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-config-data\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.384415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-combined-ca-bundle\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.384447 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-octavia-run\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.385042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-config-data-merged\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.385075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/aabf9b86-851e-47f7-9591-a2526d225e62-octavia-run\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 
02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.389665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-combined-ca-bundle\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.389739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-scripts\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.394191 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aabf9b86-851e-47f7-9591-a2526d225e62-config-data\") pod \"octavia-api-549d54fc68-lh9xg\" (UID: \"aabf9b86-851e-47f7-9591-a2526d225e62\") " pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:49 crc kubenswrapper[4749]: I1129 02:50:49.484283 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:50:50 crc kubenswrapper[4749]: I1129 02:50:50.113637 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-549d54fc68-lh9xg"] Nov 29 02:50:50 crc kubenswrapper[4749]: I1129 02:50:50.119706 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:50:50 crc kubenswrapper[4749]: I1129 02:50:50.492326 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-549d54fc68-lh9xg" event={"ID":"aabf9b86-851e-47f7-9591-a2526d225e62","Type":"ContainerStarted","Data":"274c0bbd1de4bb1fc620227502393f182483238e94d9077a4ce73ce20b0515b8"} Nov 29 02:50:52 crc kubenswrapper[4749]: I1129 02:50:52.036146 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-27gd8"] Nov 29 02:50:52 crc kubenswrapper[4749]: I1129 02:50:52.047098 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-27gd8"] Nov 29 02:50:52 crc kubenswrapper[4749]: I1129 02:50:52.075453 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:50:52 crc kubenswrapper[4749]: E1129 02:50:52.075717 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:50:53 crc kubenswrapper[4749]: I1129 02:50:53.090746 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c4a187b-e26a-45b8-83d7-a9507f71ac24" path="/var/lib/kubelet/pods/5c4a187b-e26a-45b8-83d7-a9507f71ac24/volumes" Nov 29 02:50:58 crc kubenswrapper[4749]: I1129 02:50:58.575746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-549d54fc68-lh9xg" event={"ID":"aabf9b86-851e-47f7-9591-a2526d225e62","Type":"ContainerStarted","Data":"c1e5595284c7584fc5d9e464ed5bacdd88aefe47e555eac23241d66a278efd48"} Nov 29 02:50:59 crc kubenswrapper[4749]: I1129 
02:50:59.589643 4749 generic.go:334] "Generic (PLEG): container finished" podID="aabf9b86-851e-47f7-9591-a2526d225e62" containerID="c1e5595284c7584fc5d9e464ed5bacdd88aefe47e555eac23241d66a278efd48" exitCode=0 Nov 29 02:50:59 crc kubenswrapper[4749]: I1129 02:50:59.589894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-549d54fc68-lh9xg" event={"ID":"aabf9b86-851e-47f7-9591-a2526d225e62","Type":"ContainerDied","Data":"c1e5595284c7584fc5d9e464ed5bacdd88aefe47e555eac23241d66a278efd48"} Nov 29 02:51:00 crc kubenswrapper[4749]: I1129 02:51:00.605614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-549d54fc68-lh9xg" event={"ID":"aabf9b86-851e-47f7-9591-a2526d225e62","Type":"ContainerStarted","Data":"703f6c136830f6951a0c79a15a4e00f9de6846274b63b0de38c87809ca36bc32"} Nov 29 02:51:00 crc kubenswrapper[4749]: I1129 02:51:00.605983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-549d54fc68-lh9xg" event={"ID":"aabf9b86-851e-47f7-9591-a2526d225e62","Type":"ContainerStarted","Data":"b67e7721579c9da26551ab21af8e30dbb6bc22afdbec9f79454ed6071b623c92"} Nov 29 02:51:00 crc kubenswrapper[4749]: I1129 02:51:00.606015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:51:00 crc kubenswrapper[4749]: I1129 02:51:00.606028 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:51:00 crc kubenswrapper[4749]: I1129 02:51:00.635523 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-549d54fc68-lh9xg" podStartSLOduration=3.514169682 podStartE2EDuration="11.635503963s" podCreationTimestamp="2025-11-29 02:50:49 +0000 UTC" firstStartedPulling="2025-11-29 02:50:50.119471848 +0000 UTC m=+5993.291621705" lastFinishedPulling="2025-11-29 02:50:58.240806109 +0000 UTC m=+6001.412955986" observedRunningTime="2025-11-29 02:51:00.625744595 +0000 UTC m=+6003.797894462" watchObservedRunningTime="2025-11-29 02:51:00.635503963 +0000 UTC m=+6003.807653820" Nov 29 02:51:03 crc kubenswrapper[4749]: I1129 02:51:03.085935 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:51:03 crc kubenswrapper[4749]: I1129 02:51:03.644823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311"} Nov 29 02:51:04 crc kubenswrapper[4749]: I1129 02:51:04.190917 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-z2brk" podUID="0cffba40-2751-4ab1-a9b5-9c8c041a83f8" containerName="ovn-controller" probeResult="failure" output=< Nov 29 02:51:04 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 02:51:04 crc kubenswrapper[4749]: > Nov 29 02:51:08 crc kubenswrapper[4749]: I1129 02:51:08.131363 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.156448 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-z2brk" podUID="0cffba40-2751-4ab1-a9b5-9c8c041a83f8" containerName="ovn-controller" probeResult="failure" output=< Nov 29 02:51:09 crc 
kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 29 02:51:09 crc kubenswrapper[4749]: > Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.193540 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.218746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-7bnqw" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.358764 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-z2brk-config-wfc48"] Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.360889 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.363795 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.376402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z2brk-config-wfc48"] Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.555314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh78n\" (UniqueName: \"kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.555748 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.555806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.555885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.556014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.556044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658408 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh78n\" (UniqueName: \"kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.658917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.659897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.660886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.695039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh78n\" (UniqueName: \"kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n\") pod \"ovn-controller-z2brk-config-wfc48\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:09 crc kubenswrapper[4749]: I1129 02:51:09.698567 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:10 crc kubenswrapper[4749]: I1129 02:51:10.162858 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-z2brk-config-wfc48"] Nov 29 02:51:10 crc kubenswrapper[4749]: I1129 02:51:10.760629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk-config-wfc48" event={"ID":"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc","Type":"ContainerStarted","Data":"7eca393b6ee4afdbf4866193fffcc844b6569bc01c8a62fc888fd70f3727aa06"} Nov 29 02:51:10 crc kubenswrapper[4749]: I1129 02:51:10.760960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk-config-wfc48" event={"ID":"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc","Type":"ContainerStarted","Data":"eaacbab8fe4d0b7a20143809b81c92b60fee6d77d03e036b3abb34ad678b2de0"} Nov 29 02:51:10 crc kubenswrapper[4749]: I1129 02:51:10.788274 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-z2brk-config-wfc48" podStartSLOduration=1.788248087 podStartE2EDuration="1.788248087s" podCreationTimestamp="2025-11-29 02:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:51:10.782338083 +0000 UTC m=+6013.954487940" watchObservedRunningTime="2025-11-29 02:51:10.788248087 +0000 UTC m=+6013.960397954" Nov 29 02:51:11 crc kubenswrapper[4749]: I1129 02:51:11.775665 4749 generic.go:334] "Generic (PLEG): container finished" podID="88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" containerID="7eca393b6ee4afdbf4866193fffcc844b6569bc01c8a62fc888fd70f3727aa06" exitCode=0 Nov 29 02:51:11 crc kubenswrapper[4749]: I1129 02:51:11.776143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk-config-wfc48" event={"ID":"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc","Type":"ContainerDied","Data":"7eca393b6ee4afdbf4866193fffcc844b6569bc01c8a62fc888fd70f3727aa06"} Nov 29 
02:51:11 crc kubenswrapper[4749]: I1129 02:51:11.852305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-549d54fc68-lh9xg" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.151768 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-82cgw"] Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.154310 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.156933 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.157317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.157364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.161456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-82cgw"] Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.161546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9423ac5b-2562-42f0-b428-7970fead8108-config-data-merged\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.161694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9423ac5b-2562-42f0-b428-7970fead8108-hm-ports\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.161734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-config-data\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.161822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-scripts\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.241397 4749 util.go:48] "No ready sandbox for pod can be found. 
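
The VerifyControllerAttachedVolume and MountVolume pairs that follow every pod ADD are the kubelet volume manager's reconciler: it diffs a desired state of world (volumes required by pods assigned to this node) against an actual state of world (volumes currently attached and mounted) and issues operations for the difference, which is exactly the started/succeeded pairing in these entries. A toy model of the pattern, with illustrative types only:

    package sketch

    // volumeState maps a unique volume name to whether it is mounted.
    type volumeState map[string]bool

    // reconcile issues mounts for desired-but-absent volumes and unmounts for
    // mounted-but-no-longer-desired ones, the two message families seen above.
    func reconcile(desired, actual volumeState, mount, unmount func(string)) {
        for v := range desired {
            if !actual[v] {
                mount(v) // "operationExecutor.MountVolume started for volume ..."
            }
        }
        for v := range actual {
            if !desired[v] {
                unmount(v) // "operationExecutor.UnmountVolume started for volume ..."
            }
        }
    }
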
Need to start a new one" pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.263112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9423ac5b-2562-42f0-b428-7970fead8108-hm-ports\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.263156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-config-data\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.263220 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-scripts\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.263750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9423ac5b-2562-42f0-b428-7970fead8108-config-data-merged\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.264157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9423ac5b-2562-42f0-b428-7970fead8108-config-data-merged\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.264850 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9423ac5b-2562-42f0-b428-7970fead8108-hm-ports\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.270218 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-config-data\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.287764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9423ac5b-2562-42f0-b428-7970fead8108-scripts\") pod \"octavia-rsyslog-82cgw\" (UID: \"9423ac5b-2562-42f0-b428-7970fead8108\") " pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.364763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.364822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh78n\" (UniqueName: 
\"kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.364866 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365023 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365080 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts\") pod \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\" (UID: \"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc\") " Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run" (OuterVolumeSpecName: "var-run") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365551 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365570 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365581 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-var-run\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.365737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.366183 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts" (OuterVolumeSpecName: "scripts") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.368687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n" (OuterVolumeSpecName: "kube-api-access-kh78n") pod "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" (UID: "88eae2ed-6184-4cb5-8501-67e8e0d3f1cc"). InnerVolumeSpecName "kube-api-access-kh78n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.467855 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.468480 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.468506 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh78n\" (UniqueName: \"kubernetes.io/projected/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc-kube-api-access-kh78n\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.554084 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.788204 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:51:13 crc kubenswrapper[4749]: E1129 02:51:13.788852 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" containerName="ovn-config" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.788867 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" containerName="ovn-config" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.789066 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" containerName="ovn-config" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.791292 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.797071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.810394 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-z2brk-config-wfc48" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.812537 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.812585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-z2brk-config-wfc48" event={"ID":"88eae2ed-6184-4cb5-8501-67e8e0d3f1cc","Type":"ContainerDied","Data":"eaacbab8fe4d0b7a20143809b81c92b60fee6d77d03e036b3abb34ad678b2de0"} Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.812633 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaacbab8fe4d0b7a20143809b81c92b60fee6d77d03e036b3abb34ad678b2de0" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.915945 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-z2brk-config-wfc48"] Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.923657 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-z2brk-config-wfc48"] Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.977518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image\") pod \"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:13 crc kubenswrapper[4749]: I1129 02:51:13.977575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config\") pod \"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.079853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image\") pod 
\"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.079903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config\") pod \"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.081551 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image\") pod \"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.088245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config\") pod \"octavia-image-upload-59f8cff499-8kxpd\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.129434 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.157775 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-z2brk" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.242798 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-82cgw"] Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.818680 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.832899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-82cgw" event={"ID":"9423ac5b-2562-42f0-b428-7970fead8108","Type":"ContainerStarted","Data":"390ea52a3783e8c3bf6f511bd114ac895464df36f0d1f99868949d44dc0e85ee"} Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.968812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-zpzjj"] Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.970857 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.973190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Nov 29 02:51:14 crc kubenswrapper[4749]: I1129 02:51:14.979463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-zpzjj"] Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.093124 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88eae2ed-6184-4cb5-8501-67e8e0d3f1cc" path="/var/lib/kubelet/pods/88eae2ed-6184-4cb5-8501-67e8e0d3f1cc/volumes" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.103269 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.103333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.103685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.103804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.205605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.205752 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.205776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.205863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.208186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.215236 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.224051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.235675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle\") pod \"octavia-db-sync-zpzjj\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.305814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.529602 4749 scope.go:117] "RemoveContainer" containerID="f82e00b9bde96e9dff121bf21813ccceda8ddd947f5184c7e247a3e5dddc7c1e" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.656616 4749 scope.go:117] "RemoveContainer" containerID="65f1a59e3ba03494e4d214bee39f739421180a69941f6d32de304948ea5cef79" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.711085 4749 scope.go:117] "RemoveContainer" containerID="f7c0f70b0063879c18dfaa90a5eabb8b3948594c314efa818996c2b7721c34bb" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.821448 4749 scope.go:117] "RemoveContainer" containerID="7f54f9502806954614081ae49fbf1cc5873f5880151030cef00f6f3b5559f2fd" Nov 29 02:51:15 crc kubenswrapper[4749]: I1129 02:51:15.867792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerStarted","Data":"da1880da91398e96ae3aa51004bcacd9644de6c1a6a91acfc1586624084ed441"} Nov 29 02:51:16 crc kubenswrapper[4749]: I1129 02:51:16.188778 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-zpzjj"] Nov 29 02:51:16 crc kubenswrapper[4749]: W1129 02:51:16.534858 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf45513c9_263f_4a5b_9cba_e3412e0ac46d.slice/crio-20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad WatchSource:0}: Error finding container 20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad: Status 404 returned error can't find the container with id 
20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad Nov 29 02:51:16 crc kubenswrapper[4749]: I1129 02:51:16.885449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-82cgw" event={"ID":"9423ac5b-2562-42f0-b428-7970fead8108","Type":"ContainerStarted","Data":"ba7d34b0d0e4a1836d1863c5239ca46e73ce7ee701ae928e143a5663d3a33f66"} Nov 29 02:51:16 crc kubenswrapper[4749]: I1129 02:51:16.888038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zpzjj" event={"ID":"f45513c9-263f-4a5b-9cba-e3412e0ac46d","Type":"ContainerStarted","Data":"20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad"} Nov 29 02:51:17 crc kubenswrapper[4749]: I1129 02:51:17.902372 4749 generic.go:334] "Generic (PLEG): container finished" podID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerID="23fb169c73ef17503ee8c859a534f7051825dc16f46dac76f6d21c0f650f1620" exitCode=0 Nov 29 02:51:17 crc kubenswrapper[4749]: I1129 02:51:17.902435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zpzjj" event={"ID":"f45513c9-263f-4a5b-9cba-e3412e0ac46d","Type":"ContainerDied","Data":"23fb169c73ef17503ee8c859a534f7051825dc16f46dac76f6d21c0f650f1620"} Nov 29 02:51:18 crc kubenswrapper[4749]: I1129 02:51:18.913774 4749 generic.go:334] "Generic (PLEG): container finished" podID="9423ac5b-2562-42f0-b428-7970fead8108" containerID="ba7d34b0d0e4a1836d1863c5239ca46e73ce7ee701ae928e143a5663d3a33f66" exitCode=0 Nov 29 02:51:18 crc kubenswrapper[4749]: I1129 02:51:18.913828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-82cgw" event={"ID":"9423ac5b-2562-42f0-b428-7970fead8108","Type":"ContainerDied","Data":"ba7d34b0d0e4a1836d1863c5239ca46e73ce7ee701ae928e143a5663d3a33f66"} Nov 29 02:51:19 crc kubenswrapper[4749]: I1129 02:51:19.925697 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zpzjj" event={"ID":"f45513c9-263f-4a5b-9cba-e3412e0ac46d","Type":"ContainerStarted","Data":"62c902af63a7eed153f838a027c15bd5e050f638857a916cc35c50b37d9c7f60"} Nov 29 02:51:19 crc kubenswrapper[4749]: I1129 02:51:19.947893 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-zpzjj" podStartSLOduration=5.947875044 podStartE2EDuration="5.947875044s" podCreationTimestamp="2025-11-29 02:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:51:19.943734163 +0000 UTC m=+6023.115884030" watchObservedRunningTime="2025-11-29 02:51:19.947875044 +0000 UTC m=+6023.120024901" Nov 29 02:51:21 crc kubenswrapper[4749]: I1129 02:51:21.954803 4749 generic.go:334] "Generic (PLEG): container finished" podID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerID="62c902af63a7eed153f838a027c15bd5e050f638857a916cc35c50b37d9c7f60" exitCode=0 Nov 29 02:51:21 crc kubenswrapper[4749]: I1129 02:51:21.954943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zpzjj" event={"ID":"f45513c9-263f-4a5b-9cba-e3412e0ac46d","Type":"ContainerDied","Data":"62c902af63a7eed153f838a027c15bd5e050f638857a916cc35c50b37d9c7f60"} Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.129147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-kx7r9"] Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.132376 4749 util.go:30] "No sandbox for pod can be found. 
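
octavia-db-sync-zpzjj follows the init/main pattern used throughout this section: a first container (23fb169c...) exits 0 after apparently rendering config-data into the shared config-data-merged emptyDir, then the main container (62c902af...) runs the sync and also exits 0. The W1129 "Failed to process watch event ... 404" immediately above is cadvisor racing a short-lived CRI-O cgroup and is benign. A sketch of the pod shape implied by the volume names; container names, images and commands are placeholders, not taken from the log:

    package sketch

    import corev1 "k8s.io/api/core/v1"

    // dbSyncPodSpec sketches an init container merging configuration into a
    // shared emptyDir that the main job container then consumes.
    func dbSyncPodSpec() corev1.PodSpec {
        shared := corev1.VolumeMount{Name: "config-data-merged", MountPath: "/etc/octavia"}
        return corev1.PodSpec{
            RestartPolicy: corev1.RestartPolicyNever, // typical for one-shot jobs
            Volumes: []corev1.Volume{{
                Name:         "config-data-merged",
                VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}},
            }},
            InitContainers: []corev1.Container{{
                Name:         "init",
                Image:        "example.invalid/octavia:latest",
                Command:      []string{"/bin/sh", "-c", "render-config"}, // placeholder
                VolumeMounts: []corev1.VolumeMount{shared},
            }},
            Containers: []corev1.Container{{
                Name:         "octavia-db-sync",
                Image:        "example.invalid/octavia:latest",
                Command:      []string{"/bin/sh", "-c", "run-db-sync"}, // placeholder
                VolumeMounts: []corev1.VolumeMount{shared},
            }},
        }
    }
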
Need to start a new one" pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.135537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.135753 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.139658 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.160010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-kx7r9"] Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.324664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cecd151e-d11b-4784-95b1-2af60d6017a6-hm-ports\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.325588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-amphora-certs\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.325626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-combined-ca-bundle\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.325660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.325732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data-merged\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.325786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-scripts\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.428103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cecd151e-d11b-4784-95b1-2af60d6017a6-hm-ports\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc 
kubenswrapper[4749]: I1129 02:51:23.428165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-amphora-certs\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.428213 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-combined-ca-bundle\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.428242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.428270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data-merged\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.428310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-scripts\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.430303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data-merged\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.430661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/cecd151e-d11b-4784-95b1-2af60d6017a6-hm-ports\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.435953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-combined-ca-bundle\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.436007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-scripts\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.436833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-config-data\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.437324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/cecd151e-d11b-4784-95b1-2af60d6017a6-amphora-certs\") pod \"octavia-healthmanager-kx7r9\" (UID: \"cecd151e-d11b-4784-95b1-2af60d6017a6\") " pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.464400 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.758650 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.835097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged\") pod \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.835705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data\") pod \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.835819 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts\") pod \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.836032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle\") pod \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\" (UID: \"f45513c9-263f-4a5b-9cba-e3412e0ac46d\") " Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.842005 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts" (OuterVolumeSpecName: "scripts") pod "f45513c9-263f-4a5b-9cba-e3412e0ac46d" (UID: "f45513c9-263f-4a5b-9cba-e3412e0ac46d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.854042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data" (OuterVolumeSpecName: "config-data") pod "f45513c9-263f-4a5b-9cba-e3412e0ac46d" (UID: "f45513c9-263f-4a5b-9cba-e3412e0ac46d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.870128 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f45513c9-263f-4a5b-9cba-e3412e0ac46d" (UID: "f45513c9-263f-4a5b-9cba-e3412e0ac46d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.877311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "f45513c9-263f-4a5b-9cba-e3412e0ac46d" (UID: "f45513c9-263f-4a5b-9cba-e3412e0ac46d"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.939033 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.939068 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.939077 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f45513c9-263f-4a5b-9cba-e3412e0ac46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.939086 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f45513c9-263f-4a5b-9cba-e3412e0ac46d-config-data-merged\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.979321 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-zpzjj" event={"ID":"f45513c9-263f-4a5b-9cba-e3412e0ac46d","Type":"ContainerDied","Data":"20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad"} Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.979358 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a4ef3b8fb548af930737fab2fb4a87405e79c5caf8095bec31e6c6661f04ad" Nov 29 02:51:23 crc kubenswrapper[4749]: I1129 02:51:23.979416 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-zpzjj" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.565895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-kx7r9"] Nov 29 02:51:24 crc kubenswrapper[4749]: W1129 02:51:24.577398 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcecd151e_d11b_4784_95b1_2af60d6017a6.slice/crio-2fcfdcc34ae4b5020889baf3737de4497e7777d5c9e9b4502b1ad6be9edb3120 WatchSource:0}: Error finding container 2fcfdcc34ae4b5020889baf3737de4497e7777d5c9e9b4502b1ad6be9edb3120: Status 404 returned error can't find the container with id 2fcfdcc34ae4b5020889baf3737de4497e7777d5c9e9b4502b1ad6be9edb3120 Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.677234 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-n672w"] Nov 29 02:51:24 crc kubenswrapper[4749]: E1129 02:51:24.677651 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerName="init" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.677666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerName="init" Nov 29 02:51:24 crc kubenswrapper[4749]: E1129 02:51:24.677709 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerName="octavia-db-sync" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.677716 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerName="octavia-db-sync" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.677901 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" containerName="octavia-db-sync" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.678946 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.681690 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.682635 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.688505 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-n672w"] Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.857771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-amphora-certs\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.857847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-combined-ca-bundle\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.857942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-hm-ports\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.857967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-scripts\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.858025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data-merged\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.858048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.961568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-amphora-certs\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.962023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-combined-ca-bundle\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.962137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-hm-ports\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.962175 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-scripts\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.962285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data-merged\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.962321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.963096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data-merged\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.964876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-hm-ports\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.968055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-scripts\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.968809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-amphora-certs\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.969368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-config-data\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " 
pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:24 crc kubenswrapper[4749]: I1129 02:51:24.971368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c-combined-ca-bundle\") pod \"octavia-housekeeping-n672w\" (UID: \"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c\") " pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.003116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerStarted","Data":"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a"} Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.007537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-kx7r9" event={"ID":"cecd151e-d11b-4784-95b1-2af60d6017a6","Type":"ContainerStarted","Data":"2fcfdcc34ae4b5020889baf3737de4497e7777d5c9e9b4502b1ad6be9edb3120"} Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.011870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-82cgw" event={"ID":"9423ac5b-2562-42f0-b428-7970fead8108","Type":"ContainerStarted","Data":"a98ac945f637f0d21af33429da30247a91c4a7fa9350a5c9fec381d5fd4cfa5e"} Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.012693 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.043798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.078665 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-82cgw" podStartSLOduration=1.982204151 podStartE2EDuration="12.078640667s" podCreationTimestamp="2025-11-29 02:51:13 +0000 UTC" firstStartedPulling="2025-11-29 02:51:14.273705191 +0000 UTC m=+6017.445855048" lastFinishedPulling="2025-11-29 02:51:24.370141687 +0000 UTC m=+6027.542291564" observedRunningTime="2025-11-29 02:51:25.065368975 +0000 UTC m=+6028.237518892" watchObservedRunningTime="2025-11-29 02:51:25.078640667 +0000 UTC m=+6028.250790544" Nov 29 02:51:25 crc kubenswrapper[4749]: W1129 02:51:25.652052 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91dc96dc_0ef3_4fd5_b324_016cd3d1ba4c.slice/crio-39e0fdcee4e5bb59294064c5fc98a64031a1713fa43779ec2784a723621f656b WatchSource:0}: Error finding container 39e0fdcee4e5bb59294064c5fc98a64031a1713fa43779ec2784a723621f656b: Status 404 returned error can't find the container with id 39e0fdcee4e5bb59294064c5fc98a64031a1713fa43779ec2784a723621f656b Nov 29 02:51:25 crc kubenswrapper[4749]: I1129 02:51:25.653715 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-n672w"] Nov 29 02:51:26 crc kubenswrapper[4749]: I1129 02:51:26.024190 4749 generic.go:334] "Generic (PLEG): container finished" podID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerID="8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a" exitCode=0 Nov 29 02:51:26 crc kubenswrapper[4749]: I1129 02:51:26.024270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" 
event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerDied","Data":"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a"} Nov 29 02:51:26 crc kubenswrapper[4749]: I1129 02:51:26.026292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n672w" event={"ID":"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c","Type":"ContainerStarted","Data":"39e0fdcee4e5bb59294064c5fc98a64031a1713fa43779ec2784a723621f656b"} Nov 29 02:51:26 crc kubenswrapper[4749]: I1129 02:51:26.029043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-kx7r9" event={"ID":"cecd151e-d11b-4784-95b1-2af60d6017a6","Type":"ContainerStarted","Data":"73857ba310cec6eaf646479bfaa86e1b83b2546e6ed9e31a3086b7c2fb450a0c"} Nov 29 02:51:27 crc kubenswrapper[4749]: I1129 02:51:27.047332 4749 generic.go:334] "Generic (PLEG): container finished" podID="cecd151e-d11b-4784-95b1-2af60d6017a6" containerID="73857ba310cec6eaf646479bfaa86e1b83b2546e6ed9e31a3086b7c2fb450a0c" exitCode=0 Nov 29 02:51:27 crc kubenswrapper[4749]: I1129 02:51:27.047507 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-kx7r9" event={"ID":"cecd151e-d11b-4784-95b1-2af60d6017a6","Type":"ContainerDied","Data":"73857ba310cec6eaf646479bfaa86e1b83b2546e6ed9e31a3086b7c2fb450a0c"} Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.579778 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-xddrk"] Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.582826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.585606 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.586171 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.609030 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-xddrk"] Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.665121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-hm-ports\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.665174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-amphora-certs\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.665375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.665783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-combined-ca-bundle\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.665965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-scripts\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.666011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data-merged\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.767804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-hm-ports\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.767879 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-amphora-certs\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.767919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.768017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-combined-ca-bundle\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.768086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-scripts\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.768121 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data-merged\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.768692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data-merged\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" 
Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.768808 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-hm-ports\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.773924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-amphora-certs\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.774004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-combined-ca-bundle\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.775082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-scripts\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.775681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f261d77c-0a22-41ae-9b6f-7a43382b8ca8-config-data\") pod \"octavia-worker-xddrk\" (UID: \"f261d77c-0a22-41ae-9b6f-7a43382b8ca8\") " pod="openstack/octavia-worker-xddrk" Nov 29 02:51:28 crc kubenswrapper[4749]: I1129 02:51:28.902915 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-xddrk" Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.116311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n672w" event={"ID":"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c","Type":"ContainerStarted","Data":"2db42ece24a8cfd4525ccfc30b1b3cbce48776a7abe8c03d39fff2ba4a5c2275"} Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.129166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-kx7r9" event={"ID":"cecd151e-d11b-4784-95b1-2af60d6017a6","Type":"ContainerStarted","Data":"882a0274ffa778c1e53afe91aafe8bc6c903c07dcbc70f008a2aa4a2faae354e"} Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.129632 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.157935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerStarted","Data":"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b"} Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.183180 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-kx7r9" podStartSLOduration=6.183131018 podStartE2EDuration="6.183131018s" podCreationTimestamp="2025-11-29 02:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:51:29.170344387 +0000 UTC m=+6032.342494244" watchObservedRunningTime="2025-11-29 02:51:29.183131018 +0000 UTC m=+6032.355280875" Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.472164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" podStartSLOduration=3.310968029 podStartE2EDuration="16.472143753s" podCreationTimestamp="2025-11-29 02:51:13 +0000 UTC" firstStartedPulling="2025-11-29 02:51:14.831369216 +0000 UTC m=+6018.003519073" lastFinishedPulling="2025-11-29 02:51:27.99254494 +0000 UTC m=+6031.164694797" observedRunningTime="2025-11-29 02:51:29.221846309 +0000 UTC m=+6032.393996176" watchObservedRunningTime="2025-11-29 02:51:29.472143753 +0000 UTC m=+6032.644293620" Nov 29 02:51:29 crc kubenswrapper[4749]: I1129 02:51:29.481849 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-xddrk"] Nov 29 02:51:29 crc kubenswrapper[4749]: W1129 02:51:29.490003 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf261d77c_0a22_41ae_9b6f_7a43382b8ca8.slice/crio-963216f58e32aa8bd08a48522476b0eb41fb1e2045c8679b37a4134021469e1d WatchSource:0}: Error finding container 963216f58e32aa8bd08a48522476b0eb41fb1e2045c8679b37a4134021469e1d: Status 404 returned error can't find the container with id 963216f58e32aa8bd08a48522476b0eb41fb1e2045c8679b37a4134021469e1d Nov 29 02:51:30 crc kubenswrapper[4749]: I1129 02:51:30.167638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-xddrk" event={"ID":"f261d77c-0a22-41ae-9b6f-7a43382b8ca8","Type":"ContainerStarted","Data":"963216f58e32aa8bd08a48522476b0eb41fb1e2045c8679b37a4134021469e1d"} Nov 29 02:51:31 crc kubenswrapper[4749]: I1129 02:51:31.182289 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c" containerID="2db42ece24a8cfd4525ccfc30b1b3cbce48776a7abe8c03d39fff2ba4a5c2275" exitCode=0 Nov 29 02:51:31 crc kubenswrapper[4749]: I1129 02:51:31.182398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n672w" event={"ID":"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c","Type":"ContainerDied","Data":"2db42ece24a8cfd4525ccfc30b1b3cbce48776a7abe8c03d39fff2ba4a5c2275"} Nov 29 02:51:32 crc kubenswrapper[4749]: I1129 02:51:32.197187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-n672w" event={"ID":"91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c","Type":"ContainerStarted","Data":"340487d7f9628bc520786da497b036e5c4c2f7dc9983b207bfe6302e3208002b"} Nov 29 02:51:32 crc kubenswrapper[4749]: I1129 02:51:32.197932 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:32 crc kubenswrapper[4749]: I1129 02:51:32.199919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-xddrk" event={"ID":"f261d77c-0a22-41ae-9b6f-7a43382b8ca8","Type":"ContainerStarted","Data":"2e91b6afd00428028a86eafa1f6022b37561187f76d3816dd9b34c6c295a1bfc"} Nov 29 02:51:32 crc kubenswrapper[4749]: I1129 02:51:32.226929 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-n672w" podStartSLOduration=5.892568541 podStartE2EDuration="8.226899247s" podCreationTimestamp="2025-11-29 02:51:24 +0000 UTC" firstStartedPulling="2025-11-29 02:51:25.654071874 +0000 UTC m=+6028.826221731" lastFinishedPulling="2025-11-29 02:51:27.98840258 +0000 UTC m=+6031.160552437" observedRunningTime="2025-11-29 02:51:32.219680292 +0000 UTC m=+6035.391830169" watchObservedRunningTime="2025-11-29 02:51:32.226899247 +0000 UTC m=+6035.399049154" Nov 29 02:51:33 crc kubenswrapper[4749]: I1129 02:51:33.214482 4749 generic.go:334] "Generic (PLEG): container finished" podID="f261d77c-0a22-41ae-9b6f-7a43382b8ca8" containerID="2e91b6afd00428028a86eafa1f6022b37561187f76d3816dd9b34c6c295a1bfc" exitCode=0 Nov 29 02:51:33 crc kubenswrapper[4749]: I1129 02:51:33.216796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-xddrk" event={"ID":"f261d77c-0a22-41ae-9b6f-7a43382b8ca8","Type":"ContainerDied","Data":"2e91b6afd00428028a86eafa1f6022b37561187f76d3816dd9b34c6c295a1bfc"} Nov 29 02:51:34 crc kubenswrapper[4749]: I1129 02:51:34.227594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-xddrk" event={"ID":"f261d77c-0a22-41ae-9b6f-7a43382b8ca8","Type":"ContainerStarted","Data":"c0548d7b0773d8eedd14a0387ad019206a2f63df985dbc6a50a6323532c64079"} Nov 29 02:51:34 crc kubenswrapper[4749]: I1129 02:51:34.228705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-xddrk" Nov 29 02:51:34 crc kubenswrapper[4749]: I1129 02:51:34.264432 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-xddrk" podStartSLOduration=4.6556878600000005 podStartE2EDuration="6.26440599s" podCreationTimestamp="2025-11-29 02:51:28 +0000 UTC" firstStartedPulling="2025-11-29 02:51:29.492469237 +0000 UTC m=+6032.664619084" lastFinishedPulling="2025-11-29 02:51:31.101187347 +0000 UTC m=+6034.273337214" observedRunningTime="2025-11-29 02:51:34.250052921 +0000 UTC m=+6037.422202788" watchObservedRunningTime="2025-11-29 02:51:34.26440599 +0000 UTC m=+6037.436555857" Nov 29 02:51:38 crc 
kubenswrapper[4749]: I1129 02:51:38.516745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-kx7r9" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.088268 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-n672w" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.726763 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.729760 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.738389 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.890484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.890933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.890986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79qb\" (UniqueName: \"kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.992478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.992537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t79qb\" (UniqueName: \"kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.992674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.993148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities\") 
pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:40 crc kubenswrapper[4749]: I1129 02:51:40.993465 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:41 crc kubenswrapper[4749]: I1129 02:51:41.017945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79qb\" (UniqueName: \"kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb\") pod \"community-operators-vrxfc\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:41 crc kubenswrapper[4749]: I1129 02:51:41.127734 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:41 crc kubenswrapper[4749]: I1129 02:51:41.643318 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:42 crc kubenswrapper[4749]: I1129 02:51:42.345018 4749 generic.go:334] "Generic (PLEG): container finished" podID="069b657d-edcc-48af-82af-ef6fcfd99030" containerID="b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2" exitCode=0 Nov 29 02:51:42 crc kubenswrapper[4749]: I1129 02:51:42.345850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerDied","Data":"b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2"} Nov 29 02:51:42 crc kubenswrapper[4749]: I1129 02:51:42.347085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerStarted","Data":"6b1b144e4021d96b0cf7fbfdfd55395b4bd860595e73becc7fd6ef881755abd2"} Nov 29 02:51:43 crc kubenswrapper[4749]: I1129 02:51:43.594739 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-82cgw" Nov 29 02:51:43 crc kubenswrapper[4749]: I1129 02:51:43.953805 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-xddrk" Nov 29 02:51:44 crc kubenswrapper[4749]: I1129 02:51:44.368313 4749 generic.go:334] "Generic (PLEG): container finished" podID="069b657d-edcc-48af-82af-ef6fcfd99030" containerID="36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050" exitCode=0 Nov 29 02:51:44 crc kubenswrapper[4749]: I1129 02:51:44.368397 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerDied","Data":"36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050"} Nov 29 02:51:45 crc kubenswrapper[4749]: I1129 02:51:45.378745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerStarted","Data":"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f"} Nov 29 02:51:45 crc kubenswrapper[4749]: I1129 02:51:45.397059 4749 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/community-operators-vrxfc" podStartSLOduration=2.80377595 podStartE2EDuration="5.397034461s" podCreationTimestamp="2025-11-29 02:51:40 +0000 UTC" firstStartedPulling="2025-11-29 02:51:42.346762002 +0000 UTC m=+6045.518911859" lastFinishedPulling="2025-11-29 02:51:44.940020503 +0000 UTC m=+6048.112170370" observedRunningTime="2025-11-29 02:51:45.396289253 +0000 UTC m=+6048.568439130" watchObservedRunningTime="2025-11-29 02:51:45.397034461 +0000 UTC m=+6048.569184358" Nov 29 02:51:51 crc kubenswrapper[4749]: I1129 02:51:51.128647 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:51 crc kubenswrapper[4749]: I1129 02:51:51.129233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:51 crc kubenswrapper[4749]: I1129 02:51:51.194058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:51 crc kubenswrapper[4749]: I1129 02:51:51.490615 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:52 crc kubenswrapper[4749]: I1129 02:51:52.663928 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:53 crc kubenswrapper[4749]: I1129 02:51:53.470467 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vrxfc" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="registry-server" containerID="cri-o://2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f" gracePeriod=2 Nov 29 02:51:53 crc kubenswrapper[4749]: I1129 02:51:53.997896 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.123313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content\") pod \"069b657d-edcc-48af-82af-ef6fcfd99030\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.123371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79qb\" (UniqueName: \"kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb\") pod \"069b657d-edcc-48af-82af-ef6fcfd99030\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.123512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities\") pod \"069b657d-edcc-48af-82af-ef6fcfd99030\" (UID: \"069b657d-edcc-48af-82af-ef6fcfd99030\") " Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.124582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities" (OuterVolumeSpecName: "utilities") pod "069b657d-edcc-48af-82af-ef6fcfd99030" (UID: "069b657d-edcc-48af-82af-ef6fcfd99030"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.141585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb" (OuterVolumeSpecName: "kube-api-access-t79qb") pod "069b657d-edcc-48af-82af-ef6fcfd99030" (UID: "069b657d-edcc-48af-82af-ef6fcfd99030"). InnerVolumeSpecName "kube-api-access-t79qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.191318 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "069b657d-edcc-48af-82af-ef6fcfd99030" (UID: "069b657d-edcc-48af-82af-ef6fcfd99030"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.226688 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.226721 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/069b657d-edcc-48af-82af-ef6fcfd99030-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.226733 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t79qb\" (UniqueName: \"kubernetes.io/projected/069b657d-edcc-48af-82af-ef6fcfd99030-kube-api-access-t79qb\") on node \"crc\" DevicePath \"\"" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.482480 4749 generic.go:334] "Generic (PLEG): container finished" podID="069b657d-edcc-48af-82af-ef6fcfd99030" containerID="2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f" exitCode=0 Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.482529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerDied","Data":"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f"} Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.482572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vrxfc" event={"ID":"069b657d-edcc-48af-82af-ef6fcfd99030","Type":"ContainerDied","Data":"6b1b144e4021d96b0cf7fbfdfd55395b4bd860595e73becc7fd6ef881755abd2"} Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.482595 4749 scope.go:117] "RemoveContainer" containerID="2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.483471 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vrxfc" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.526954 4749 scope.go:117] "RemoveContainer" containerID="36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.558335 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.568632 4749 scope.go:117] "RemoveContainer" containerID="b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.574543 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vrxfc"] Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.618272 4749 scope.go:117] "RemoveContainer" containerID="2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f" Nov 29 02:51:54 crc kubenswrapper[4749]: E1129 02:51:54.618966 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f\": container with ID starting with 2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f not found: ID does not exist" containerID="2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.619018 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f"} err="failed to get container status \"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f\": rpc error: code = NotFound desc = could not find container \"2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f\": container with ID starting with 2eda2264faa9da62457a63e42d53e6602aaf219c2745fe713f611c88c4180f3f not found: ID does not exist" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.619051 4749 scope.go:117] "RemoveContainer" containerID="36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050" Nov 29 02:51:54 crc kubenswrapper[4749]: E1129 02:51:54.619575 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050\": container with ID starting with 36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050 not found: ID does not exist" containerID="36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.619751 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050"} err="failed to get container status \"36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050\": rpc error: code = NotFound desc = could not find container \"36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050\": container with ID starting with 36219a0448bc7bdc5156d95906c417b2f0303677a16056ba568a7e7d08c19050 not found: ID does not exist" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.619788 4749 scope.go:117] "RemoveContainer" containerID="b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2" Nov 29 02:51:54 crc kubenswrapper[4749]: E1129 02:51:54.620599 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2\": container with ID starting with b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2 not found: ID does not exist" containerID="b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2" Nov 29 02:51:54 crc kubenswrapper[4749]: I1129 02:51:54.620672 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2"} err="failed to get container status \"b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2\": rpc error: code = NotFound desc = could not find container \"b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2\": container with ID starting with b49a54bb558172c55bf44c7f60599316f54558db23238c9d1c31d093e274def2 not found: ID does not exist" Nov 29 02:51:55 crc kubenswrapper[4749]: I1129 02:51:55.089884 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" path="/var/lib/kubelet/pods/069b657d-edcc-48af-82af-ef6fcfd99030/volumes" Nov 29 02:52:02 crc kubenswrapper[4749]: I1129 02:52:02.693996 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:52:02 crc kubenswrapper[4749]: I1129 02:52:02.695029 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="octavia-amphora-httpd" containerID="cri-o://767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b" gracePeriod=30 Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.337167 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.463909 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config\") pod \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.464111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image\") pod \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\" (UID: \"b60c4c09-1141-4853-8db0-c2f0ce8f7a41\") " Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.509796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b60c4c09-1141-4853-8db0-c2f0ce8f7a41" (UID: "b60c4c09-1141-4853-8db0-c2f0ce8f7a41"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.551231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "b60c4c09-1141-4853-8db0-c2f0ce8f7a41" (UID: "b60c4c09-1141-4853-8db0-c2f0ce8f7a41"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.566803 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.566846 4749 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/b60c4c09-1141-4853-8db0-c2f0ce8f7a41-amphora-image\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.585298 4749 generic.go:334] "Generic (PLEG): container finished" podID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerID="767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b" exitCode=0 Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.585353 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerDied","Data":"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b"} Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.585382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" event={"ID":"b60c4c09-1141-4853-8db0-c2f0ce8f7a41","Type":"ContainerDied","Data":"da1880da91398e96ae3aa51004bcacd9644de6c1a6a91acfc1586624084ed441"} Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.585399 4749 scope.go:117] "RemoveContainer" containerID="767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.585515 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8kxpd" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.630969 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.638795 4749 scope.go:117] "RemoveContainer" containerID="8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.641939 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8kxpd"] Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.670142 4749 scope.go:117] "RemoveContainer" containerID="767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b" Nov 29 02:52:03 crc kubenswrapper[4749]: E1129 02:52:03.670888 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b\": container with ID starting with 767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b not found: ID does not exist" containerID="767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.670929 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b"} err="failed to get container status \"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b\": rpc error: code = NotFound desc = could not find container \"767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b\": container with ID starting with 767d96e535d786d62bca3cc126917d2bd2487a1c69735bb6361c10b7a907772b not found: ID does not exist" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.670954 4749 scope.go:117] "RemoveContainer" containerID="8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a" Nov 29 02:52:03 crc kubenswrapper[4749]: E1129 02:52:03.672369 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a\": container with ID starting with 8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a not found: ID does not exist" containerID="8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a" Nov 29 02:52:03 crc kubenswrapper[4749]: I1129 02:52:03.672405 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a"} err="failed to get container status \"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a\": rpc error: code = NotFound desc = could not find container \"8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a\": container with ID starting with 8a7256c27c0a14d4d544245e0ccf9bb3d1c579614b5913824229693613a2760a not found: ID does not exist" Nov 29 02:52:05 crc kubenswrapper[4749]: I1129 02:52:05.088469 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" path="/var/lib/kubelet/pods/b60c4c09-1141-4853-8db0-c2f0ce8f7a41/volumes" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.565375 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-87bb96969-7cm75"] Nov 29 02:52:45 crc kubenswrapper[4749]: E1129 02:52:45.567288 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="extract-content" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567344 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="extract-content" Nov 29 02:52:45 crc kubenswrapper[4749]: E1129 02:52:45.567375 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="registry-server" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567390 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="registry-server" Nov 29 02:52:45 crc kubenswrapper[4749]: E1129 02:52:45.567418 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="init" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567431 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="init" Nov 29 02:52:45 crc kubenswrapper[4749]: E1129 02:52:45.567458 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="extract-utilities" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567471 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="extract-utilities" Nov 29 02:52:45 crc kubenswrapper[4749]: E1129 02:52:45.567498 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="octavia-amphora-httpd" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567510 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="octavia-amphora-httpd" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567897 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60c4c09-1141-4853-8db0-c2f0ce8f7a41" containerName="octavia-amphora-httpd" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.567949 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="069b657d-edcc-48af-82af-ef6fcfd99030" containerName="registry-server" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.569860 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.573957 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.574179 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.574504 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-qqk8d" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.574645 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.587194 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87bb96969-7cm75"] Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.613989 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.614637 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-log" containerID="cri-o://bc21b65a113c4ec0128958fd2424033a7bf8824181a1d2f04d4c0311ea08185a" gracePeriod=30 Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.615061 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-httpd" containerID="cri-o://86c570737b7092e8f37dab2c41692f1a45efbe0d3366fb17c0d1f4ebc4e850fa" gracePeriod=30 Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.668308 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"] Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.670518 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.689536 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"] Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.702347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.702414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.702453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.702509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.702548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snh7t\" (UniqueName: \"kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.709486 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.709718 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-log" containerID="cri-o://dcfa7565324a18c1ef573fc7854d07f9c9a0fe825fcaa9951a352bff69ecb5de" gracePeriod=30 Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.709864 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-httpd" containerID="cri-o://b4b474e25fca289df9a0f5c7b40be14890f107ba84fd7a0285968d84847ff1f4" gracePeriod=30 Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804252 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804534 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlvg7\" (UniqueName: \"kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804644 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.804953 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snh7t\" (UniqueName: \"kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.805775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.817321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.818848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.835480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snh7t\" (UniqueName: \"kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t\") pod \"horizon-87bb96969-7cm75\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") " pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.909629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.909708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.909796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.909883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlvg7\" (UniqueName: \"kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.909926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " 
pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.911084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.911933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.912397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.912625 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87bb96969-7cm75" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.918531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:45 crc kubenswrapper[4749]: I1129 02:52:45.943792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlvg7\" (UniqueName: \"kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7\") pod \"horizon-67fd97fcb5-fp48v\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") " pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.001669 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67fd97fcb5-fp48v" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.114731 4749 generic.go:334] "Generic (PLEG): container finished" podID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerID="bc21b65a113c4ec0128958fd2424033a7bf8824181a1d2f04d4c0311ea08185a" exitCode=143 Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.114841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerDied","Data":"bc21b65a113c4ec0128958fd2424033a7bf8824181a1d2f04d4c0311ea08185a"} Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.117241 4749 generic.go:334] "Generic (PLEG): container finished" podID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerID="dcfa7565324a18c1ef573fc7854d07f9c9a0fe825fcaa9951a352bff69ecb5de" exitCode=143 Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.117267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerDied","Data":"dcfa7565324a18c1ef573fc7854d07f9c9a0fe825fcaa9951a352bff69ecb5de"} Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.261952 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"] Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.280769 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.283183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.305291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.423765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.423844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t756l\" (UniqueName: \"kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.424053 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.424218 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.424265 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.526165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.526242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.526266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.526362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.526396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t756l\" (UniqueName: \"kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.527712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.527799 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.527840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.529160 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-87bb96969-7cm75"] Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.533762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: W1129 02:52:46.537742 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76314112_321f_424a_9b69_08330318ec1f.slice/crio-a97a7a4864f2dc5b7f6fbf8b5b1b64d90042c35195e332692dffb58436359a77 WatchSource:0}: Error finding container a97a7a4864f2dc5b7f6fbf8b5b1b64d90042c35195e332692dffb58436359a77: Status 404 returned error can't find the container with id a97a7a4864f2dc5b7f6fbf8b5b1b64d90042c35195e332692dffb58436359a77 Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.542914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t756l\" (UniqueName: \"kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l\") pod \"horizon-7849bb687c-m7rzj\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.619944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:52:46 crc kubenswrapper[4749]: I1129 02:52:46.622255 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"] Nov 29 02:52:47 crc kubenswrapper[4749]: I1129 02:52:47.130126 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:52:47 crc kubenswrapper[4749]: W1129 02:52:47.135416 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57cb1c71_97d0_4742_be6b_4ce763c9b51d.slice/crio-3f60ac14f4a73af351a1acc42ab7985c4899a60598dbce58afe155de18b42469 WatchSource:0}: Error finding container 3f60ac14f4a73af351a1acc42ab7985c4899a60598dbce58afe155de18b42469: Status 404 returned error can't find the container with id 3f60ac14f4a73af351a1acc42ab7985c4899a60598dbce58afe155de18b42469 Nov 29 02:52:47 crc kubenswrapper[4749]: I1129 02:52:47.140075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerStarted","Data":"a97a7a4864f2dc5b7f6fbf8b5b1b64d90042c35195e332692dffb58436359a77"} Nov 29 02:52:47 crc kubenswrapper[4749]: I1129 02:52:47.141797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerStarted","Data":"dfc69af172d9d381ce2be39f74a0053b5e08d0c482f1cf39b457c7c2df25df6d"} Nov 29 02:52:48 crc kubenswrapper[4749]: I1129 02:52:48.158385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerStarted","Data":"3f60ac14f4a73af351a1acc42ab7985c4899a60598dbce58afe155de18b42469"} Nov 29 02:52:48 crc kubenswrapper[4749]: I1129 02:52:48.796161 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.41:9292/healthcheck\": read tcp 10.217.0.2:54348->10.217.1.41:9292: read: connection reset by peer" Nov 29 
02:52:48 crc kubenswrapper[4749]: I1129 02:52:48.796173 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.41:9292/healthcheck\": read tcp 10.217.0.2:54338->10.217.1.41:9292: read: connection reset by peer" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.020068 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": read tcp 10.217.0.2:35008->10.217.1.42:9292: read: connection reset by peer" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.020096 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.42:9292/healthcheck\": read tcp 10.217.0.2:35012->10.217.1.42:9292: read: connection reset by peer" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.191411 4749 generic.go:334] "Generic (PLEG): container finished" podID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerID="b4b474e25fca289df9a0f5c7b40be14890f107ba84fd7a0285968d84847ff1f4" exitCode=0 Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.191492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerDied","Data":"b4b474e25fca289df9a0f5c7b40be14890f107ba84fd7a0285968d84847ff1f4"} Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.193839 4749 generic.go:334] "Generic (PLEG): container finished" podID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerID="86c570737b7092e8f37dab2c41692f1a45efbe0d3366fb17c0d1f4ebc4e850fa" exitCode=0 Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.193868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerDied","Data":"86c570737b7092e8f37dab2c41692f1a45efbe0d3366fb17c0d1f4ebc4e850fa"} Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.317937 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.407219 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.407310 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs" (OuterVolumeSpecName: "logs") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408370 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8994g\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408616 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.408682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts\") pod \"073b63e6-4340-476c-bccb-23bbd6c43cf9\" (UID: \"073b63e6-4340-476c-bccb-23bbd6c43cf9\") " Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.409223 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.410508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.413772 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts" (OuterVolumeSpecName: "scripts") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.418653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g" (OuterVolumeSpecName: "kube-api-access-8994g") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "kube-api-access-8994g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.418889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph" (OuterVolumeSpecName: "ceph") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.437541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.484346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data" (OuterVolumeSpecName: "config-data") pod "073b63e6-4340-476c-bccb-23bbd6c43cf9" (UID: "073b63e6-4340-476c-bccb-23bbd6c43cf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510581 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510617 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510629 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/073b63e6-4340-476c-bccb-23bbd6c43cf9-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510639 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/073b63e6-4340-476c-bccb-23bbd6c43cf9-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510647 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:49 crc kubenswrapper[4749]: I1129 02:52:49.510655 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8994g\" (UniqueName: \"kubernetes.io/projected/073b63e6-4340-476c-bccb-23bbd6c43cf9-kube-api-access-8994g\") on node \"crc\" DevicePath \"\"" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.204085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"073b63e6-4340-476c-bccb-23bbd6c43cf9","Type":"ContainerDied","Data":"94b04ddf20a343d4db492cd73f639dc05ca56fa859653f37575ae66d9ca37377"} Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.204136 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.204364 4749 scope.go:117] "RemoveContainer" containerID="86c570737b7092e8f37dab2c41692f1a45efbe0d3366fb17c0d1f4ebc4e850fa" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.238544 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.255683 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.266395 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:52:50 crc kubenswrapper[4749]: E1129 02:52:50.268054 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-log" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.268173 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-log" Nov 29 02:52:50 crc kubenswrapper[4749]: E1129 02:52:50.268373 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-httpd" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.268456 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-httpd" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.293517 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-httpd" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.293588 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" containerName="glance-log" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.301376 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.307288 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.329050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-logs\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2vn\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-kube-api-access-kv2vn\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.456570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.558968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2vn\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-kube-api-access-kv2vn\") pod \"glance-default-external-api-0\" (UID: 
\"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.559858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-logs\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.560416 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-logs\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.561141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0d1f66a-70df-410a-9b29-bd418f5ba498-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.567021 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-ceph\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0" Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.568094 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-config-data\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0"
Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.577945 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-scripts\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0"
Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.596245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d1f66a-70df-410a-9b29-bd418f5ba498-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0"
Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.615245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2vn\" (UniqueName: \"kubernetes.io/projected/a0d1f66a-70df-410a-9b29-bd418f5ba498-kube-api-access-kv2vn\") pod \"glance-default-external-api-0\" (UID: \"a0d1f66a-70df-410a-9b29-bd418f5ba498\") " pod="openstack/glance-default-external-api-0"
Nov 29 02:52:50 crc kubenswrapper[4749]: I1129 02:52:50.635141 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 29 02:52:51 crc kubenswrapper[4749]: I1129 02:52:51.091273 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073b63e6-4340-476c-bccb-23bbd6c43cf9" path="/var/lib/kubelet/pods/073b63e6-4340-476c-bccb-23bbd6c43cf9/volumes"
Nov 29 02:52:54 crc kubenswrapper[4749]: I1129 02:52:54.792312 4749 scope.go:117] "RemoveContainer" containerID="bc21b65a113c4ec0128958fd2424033a7bf8824181a1d2f04d4c0311ea08185a"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.037348 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gp6p\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173145 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173197 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173235 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.173295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run\") pod \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\" (UID: \"282ac302-d5c0-4c95-b5d5-708cbaa5fc18\") "
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.174169 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.176179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs" (OuterVolumeSpecName: "logs") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.179512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p" (OuterVolumeSpecName: "kube-api-access-7gp6p") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "kube-api-access-7gp6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.179662 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts" (OuterVolumeSpecName: "scripts") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.179888 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph" (OuterVolumeSpecName: "ceph") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.257466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.268055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerStarted","Data":"82d8499978b254cd5f0aaa57908aff4712cd2f06f50f07053572e0821e3648b4"}
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.271843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerStarted","Data":"6a42efdc71f1f8074001d9d55efefb62cbc0d003658c233d1b05bec482b538a4"}
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"282ac302-d5c0-4c95-b5d5-708cbaa5fc18","Type":"ContainerDied","Data":"587f30738e34d7de46637308b74c44d72e891f6832a333f856a9d5daf69c07f9"}
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275613 4749 scope.go:117] "RemoveContainer" containerID="b4b474e25fca289df9a0f5c7b40be14890f107ba84fd7a0285968d84847ff1f4"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275656 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gp6p\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-kube-api-access-7gp6p\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275675 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-ceph\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275685 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-logs\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275693 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275702 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275710 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.275741 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.283784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerStarted","Data":"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"}
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.301548 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data" (OuterVolumeSpecName: "config-data") pod "282ac302-d5c0-4c95-b5d5-708cbaa5fc18" (UID: "282ac302-d5c0-4c95-b5d5-708cbaa5fc18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.324719 4749 scope.go:117] "RemoveContainer" containerID="dcfa7565324a18c1ef573fc7854d07f9c9a0fe825fcaa9951a352bff69ecb5de"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.377252 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282ac302-d5c0-4c95-b5d5-708cbaa5fc18-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.470055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 29 02:52:55 crc kubenswrapper[4749]: W1129 02:52:55.477293 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0d1f66a_70df_410a_9b29_bd418f5ba498.slice/crio-a41c9274c8699b48c9d4dd676682357695b7fe472643291da69233fd0dbe49b5 WatchSource:0}: Error finding container a41c9274c8699b48c9d4dd676682357695b7fe472643291da69233fd0dbe49b5: Status 404 returned error can't find the container with id a41c9274c8699b48c9d4dd676682357695b7fe472643291da69233fd0dbe49b5
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.617537 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.630021 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.637305 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 02:52:55 crc kubenswrapper[4749]: E1129 02:52:55.637670 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-log"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.637686 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-log"
Nov 29 02:52:55 crc kubenswrapper[4749]: E1129 02:52:55.637716 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-httpd"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.637723 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-httpd"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.637894 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-httpd"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.637918 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" containerName="glance-log"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.638864 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.641286 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.653282 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7mc\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-kube-api-access-xt7mc\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785466 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.785917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.786055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.887693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7mc\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-kube-api-access-xt7mc\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888282 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.888338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.890338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.890605 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.896030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.896872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.901345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.932048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.944030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7mc\" (UniqueName: \"kubernetes.io/projected/ab76b9bf-6cf3-47b8-8356-32da5b8b939a-kube-api-access-xt7mc\") pod \"glance-default-internal-api-0\" (UID: \"ab76b9bf-6cf3-47b8-8356-32da5b8b939a\") " pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:55 crc kubenswrapper[4749]: I1129 02:52:55.958666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.302024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0d1f66a-70df-410a-9b29-bd418f5ba498","Type":"ContainerStarted","Data":"4796e85cb5c9268d94c435b878fdad3b8aede2492375453a519bfb1163defcea"}
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.302073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0d1f66a-70df-410a-9b29-bd418f5ba498","Type":"ContainerStarted","Data":"a41c9274c8699b48c9d4dd676682357695b7fe472643291da69233fd0dbe49b5"}
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.315120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerStarted","Data":"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"}
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.319658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerStarted","Data":"b4853470ee8eed8df2843e4e7701044f08b50f51f1f8c8bab08d4bca8708176f"}
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.319777 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67fd97fcb5-fp48v" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon-log" containerID="cri-o://82d8499978b254cd5f0aaa57908aff4712cd2f06f50f07053572e0821e3648b4" gracePeriod=30
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.319906 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67fd97fcb5-fp48v" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon" containerID="cri-o://b4853470ee8eed8df2843e4e7701044f08b50f51f1f8c8bab08d4bca8708176f" gracePeriod=30
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.326709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerStarted","Data":"3b972c0fc3f91c7775700cc247de4f2b7038f7a8f4c8d60bf70b4b5a61b20e21"}
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.349132 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-87bb96969-7cm75" podStartSLOduration=2.971074474 podStartE2EDuration="11.349115064s" podCreationTimestamp="2025-11-29 02:52:45 +0000 UTC" firstStartedPulling="2025-11-29 02:52:46.539637176 +0000 UTC m=+6109.711787033" lastFinishedPulling="2025-11-29 02:52:54.917677746 +0000 UTC m=+6118.089827623" observedRunningTime="2025-11-29 02:52:56.337462012 +0000 UTC m=+6119.509611889" watchObservedRunningTime="2025-11-29 02:52:56.349115064 +0000 UTC m=+6119.521264921"
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.371580 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7849bb687c-m7rzj" podStartSLOduration=2.5821986260000003 podStartE2EDuration="10.371561319s" podCreationTimestamp="2025-11-29 02:52:46 +0000 UTC" firstStartedPulling="2025-11-29 02:52:47.140028287 +0000 UTC m=+6110.312178154" lastFinishedPulling="2025-11-29 02:52:54.92939098 +0000 UTC m=+6118.101540847" observedRunningTime="2025-11-29 02:52:56.360655225 +0000 UTC m=+6119.532805102" watchObservedRunningTime="2025-11-29 02:52:56.371561319 +0000 UTC m=+6119.543711176"
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.567134 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67fd97fcb5-fp48v" podStartSLOduration=3.247908693 podStartE2EDuration="11.567113055s" podCreationTimestamp="2025-11-29 02:52:45 +0000 UTC" firstStartedPulling="2025-11-29 02:52:46.634913258 +0000 UTC m=+6109.807063115" lastFinishedPulling="2025-11-29 02:52:54.95411761 +0000 UTC m=+6118.126267477" observedRunningTime="2025-11-29 02:52:56.389876564 +0000 UTC m=+6119.562026421" watchObservedRunningTime="2025-11-29 02:52:56.567113055 +0000 UTC m=+6119.739262922"
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.568575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 29 02:52:56 crc kubenswrapper[4749]: W1129 02:52:56.576367 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab76b9bf_6cf3_47b8_8356_32da5b8b939a.slice/crio-f856e91f198121afa1f5359ce58751c80a58d859193871bb0f4b577a93e056d6 WatchSource:0}: Error finding container f856e91f198121afa1f5359ce58751c80a58d859193871bb0f4b577a93e056d6: Status 404 returned error can't find the container with id f856e91f198121afa1f5359ce58751c80a58d859193871bb0f4b577a93e056d6
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.620488 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7849bb687c-m7rzj"
Nov 29 02:52:56 crc kubenswrapper[4749]: I1129 02:52:56.620539 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7849bb687c-m7rzj"
Nov 29 02:52:57 crc kubenswrapper[4749]: I1129 02:52:57.088107 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282ac302-d5c0-4c95-b5d5-708cbaa5fc18" path="/var/lib/kubelet/pods/282ac302-d5c0-4c95-b5d5-708cbaa5fc18/volumes"
Nov 29 02:52:57 crc kubenswrapper[4749]: I1129 02:52:57.347212 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a0d1f66a-70df-410a-9b29-bd418f5ba498","Type":"ContainerStarted","Data":"942265670b9ba5789c780f8c849a988e57025dc7aef1ec443fc17d0d4920f839"}
Nov 29 02:52:57 crc kubenswrapper[4749]: I1129 02:52:57.349472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab76b9bf-6cf3-47b8-8356-32da5b8b939a","Type":"ContainerStarted","Data":"df0c0d8d7ddad8d87e32d60329782cd688f149e71798ce9234f585572039c976"}
Nov 29 02:52:57 crc kubenswrapper[4749]: I1129 02:52:57.349551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab76b9bf-6cf3-47b8-8356-32da5b8b939a","Type":"ContainerStarted","Data":"f856e91f198121afa1f5359ce58751c80a58d859193871bb0f4b577a93e056d6"}
Nov 29 02:52:57 crc kubenswrapper[4749]: I1129 02:52:57.382137 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.382119183 podStartE2EDuration="7.382119183s" podCreationTimestamp="2025-11-29 02:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:52:57.365646283 +0000 UTC m=+6120.537796130" watchObservedRunningTime="2025-11-29 02:52:57.382119183 +0000 UTC m=+6120.554269040"
Nov 29 02:52:58 crc kubenswrapper[4749]: I1129 02:52:58.360990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab76b9bf-6cf3-47b8-8356-32da5b8b939a","Type":"ContainerStarted","Data":"ad3172a8d1af09af2b39263563b656966a6dd451ec27ffade537a9f8284bbfa9"}
Nov 29 02:52:58 crc kubenswrapper[4749]: I1129 02:52:58.392746 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.392728919 podStartE2EDuration="3.392728919s" podCreationTimestamp="2025-11-29 02:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:52:58.380749258 +0000 UTC m=+6121.552899145" watchObservedRunningTime="2025-11-29 02:52:58.392728919 +0000 UTC m=+6121.564878776"
Nov 29 02:53:00 crc kubenswrapper[4749]: I1129 02:53:00.640422 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:00 crc kubenswrapper[4749]: I1129 02:53:00.640764 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:00 crc kubenswrapper[4749]: I1129 02:53:00.681037 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:00 crc kubenswrapper[4749]: I1129 02:53:00.687293 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:01 crc kubenswrapper[4749]: I1129 02:53:01.388994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:01 crc kubenswrapper[4749]: I1129 02:53:01.389480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:03 crc kubenswrapper[4749]: I1129 02:53:03.410305 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 29 02:53:03 crc kubenswrapper[4749]: I1129 02:53:03.410634 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 29 02:53:03 crc kubenswrapper[4749]: I1129 02:53:03.728404 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:03 crc kubenswrapper[4749]: I1129 02:53:03.747297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 29 02:53:05 crc kubenswrapper[4749]: I1129 02:53:05.913772 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:05 crc kubenswrapper[4749]: I1129 02:53:05.914138 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:05 crc kubenswrapper[4749]: I1129 02:53:05.916098 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused"
Nov 29 02:53:05 crc kubenswrapper[4749]: I1129 02:53:05.960144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:05 crc kubenswrapper[4749]: I1129 02:53:05.960243 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.000896 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.004178 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67fd97fcb5-fp48v"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.024743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.487226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.487588 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:06 crc kubenswrapper[4749]: I1129 02:53:06.624690 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused"
Nov 29 02:53:08 crc kubenswrapper[4749]: I1129 02:53:08.403142 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:08 crc kubenswrapper[4749]: I1129 02:53:08.411559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 29 02:53:10 crc kubenswrapper[4749]: I1129 02:53:10.077881 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j7dsd"]
Nov 29 02:53:10 crc kubenswrapper[4749]: I1129 02:53:10.091449 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-j7dsd"]
Nov 29 02:53:11 crc kubenswrapper[4749]: I1129 02:53:11.053166 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4cdd-account-create-update-l6ljq"]
Nov 29 02:53:11 crc kubenswrapper[4749]: I1129 02:53:11.069405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4cdd-account-create-update-l6ljq"]
Nov 29 02:53:11 crc kubenswrapper[4749]: I1129 02:53:11.092506 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ccb818-8eda-414d-9692-ded50b74bd5e" path="/var/lib/kubelet/pods/28ccb818-8eda-414d-9692-ded50b74bd5e/volumes"
Nov 29 02:53:11 crc kubenswrapper[4749]: I1129 02:53:11.093932 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cecf171-c49e-4dd3-be48-a490e396333a" path="/var/lib/kubelet/pods/7cecf171-c49e-4dd3-be48-a490e396333a/volumes"
Nov 29 02:53:16 crc kubenswrapper[4749]: I1129 02:53:16.107422 4749 scope.go:117] "RemoveContainer" containerID="440da4586c29b79db000ec02af6c4bd1a809ad83cd072266cb8db2c5162dac90"
Nov 29 02:53:16 crc kubenswrapper[4749]: I1129 02:53:16.167977 4749 scope.go:117] "RemoveContainer" containerID="4dbc06eb7a24c743115062c55a43b788829cbebc53f725d4cfd5fa4cea148a16"
Nov 29 02:53:16 crc kubenswrapper[4749]: I1129 02:53:16.238397 4749 scope.go:117] "RemoveContainer" containerID="b48ace4a6cdadc6166c13d965648a3a6d29494cc118013971549e3d2de5decb4"
Nov 29 02:53:17 crc kubenswrapper[4749]: I1129 02:53:17.036719 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mjrvg"]
Nov 29 02:53:17 crc kubenswrapper[4749]: I1129 02:53:17.049089 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mjrvg"]
Nov 29 02:53:17 crc kubenswrapper[4749]: I1129 02:53:17.091863 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c8fbfd-ceea-44ee-9b20-b298bd7be81a" path="/var/lib/kubelet/pods/62c8fbfd-ceea-44ee-9b20-b298bd7be81a/volumes"
Nov 29 02:53:17 crc kubenswrapper[4749]: I1129 02:53:17.740333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:18 crc kubenswrapper[4749]: I1129 02:53:18.290397 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7849bb687c-m7rzj"
Nov 29 02:53:19 crc kubenswrapper[4749]: I1129 02:53:19.242408 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:19 crc kubenswrapper[4749]: I1129 02:53:19.815976 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7849bb687c-m7rzj"
Nov 29 02:53:19 crc kubenswrapper[4749]: I1129 02:53:19.912710 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-87bb96969-7cm75"]
Nov 29 02:53:19 crc kubenswrapper[4749]: I1129 02:53:19.912991 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon-log" containerID="cri-o://7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2" gracePeriod=30
Nov 29 02:53:19 crc kubenswrapper[4749]: I1129 02:53:19.913644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon" containerID="cri-o://2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd" gracePeriod=30
Nov 29 02:53:23 crc kubenswrapper[4749]: I1129 02:53:23.757399 4749 generic.go:334] "Generic (PLEG): container finished" podID="76314112-321f-424a-9b69-08330318ec1f" containerID="2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd" exitCode=0
Nov 29 02:53:23 crc kubenswrapper[4749]: I1129 02:53:23.757481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerDied","Data":"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"}
Nov 29 02:53:25 crc kubenswrapper[4749]: I1129 02:53:25.374242 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:53:25 crc kubenswrapper[4749]: I1129 02:53:25.375580 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:53:25 crc kubenswrapper[4749]: I1129 02:53:25.913976 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused"
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805016 4749 generic.go:334] "Generic (PLEG): container finished" podID="66b06843-2245-4475-b1dc-89f6015a10dc" containerID="b4853470ee8eed8df2843e4e7701044f08b50f51f1f8c8bab08d4bca8708176f" exitCode=137
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805389 4749 generic.go:334] "Generic (PLEG): container finished" podID="66b06843-2245-4475-b1dc-89f6015a10dc" containerID="82d8499978b254cd5f0aaa57908aff4712cd2f06f50f07053572e0821e3648b4" exitCode=137
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805106 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerDied","Data":"b4853470ee8eed8df2843e4e7701044f08b50f51f1f8c8bab08d4bca8708176f"}
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805436 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerDied","Data":"82d8499978b254cd5f0aaa57908aff4712cd2f06f50f07053572e0821e3648b4"}
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805451 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67fd97fcb5-fp48v" event={"ID":"66b06843-2245-4475-b1dc-89f6015a10dc","Type":"ContainerDied","Data":"dfc69af172d9d381ce2be39f74a0053b5e08d0c482f1cf39b457c7c2df25df6d"}
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.805463 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc69af172d9d381ce2be39f74a0053b5e08d0c482f1cf39b457c7c2df25df6d"
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.862406 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd97fcb5-fp48v"
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.963976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs\") pod \"66b06843-2245-4475-b1dc-89f6015a10dc\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") "
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.964049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data\") pod \"66b06843-2245-4475-b1dc-89f6015a10dc\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") "
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.964073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts\") pod \"66b06843-2245-4475-b1dc-89f6015a10dc\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") "
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.964185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlvg7\" (UniqueName: \"kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7\") pod \"66b06843-2245-4475-b1dc-89f6015a10dc\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") "
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.964229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key\") pod \"66b06843-2245-4475-b1dc-89f6015a10dc\" (UID: \"66b06843-2245-4475-b1dc-89f6015a10dc\") "
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.966185 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs" (OuterVolumeSpecName: "logs") pod "66b06843-2245-4475-b1dc-89f6015a10dc" (UID: "66b06843-2245-4475-b1dc-89f6015a10dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.970569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "66b06843-2245-4475-b1dc-89f6015a10dc" (UID: "66b06843-2245-4475-b1dc-89f6015a10dc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:53:26 crc kubenswrapper[4749]: I1129 02:53:26.971643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7" (OuterVolumeSpecName: "kube-api-access-rlvg7") pod "66b06843-2245-4475-b1dc-89f6015a10dc" (UID: "66b06843-2245-4475-b1dc-89f6015a10dc"). InnerVolumeSpecName "kube-api-access-rlvg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.002391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts" (OuterVolumeSpecName: "scripts") pod "66b06843-2245-4475-b1dc-89f6015a10dc" (UID: "66b06843-2245-4475-b1dc-89f6015a10dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.014251 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data" (OuterVolumeSpecName: "config-data") pod "66b06843-2245-4475-b1dc-89f6015a10dc" (UID: "66b06843-2245-4475-b1dc-89f6015a10dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.066549 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66b06843-2245-4475-b1dc-89f6015a10dc-logs\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.066590 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.066603 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66b06843-2245-4475-b1dc-89f6015a10dc-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.066618 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlvg7\" (UniqueName: \"kubernetes.io/projected/66b06843-2245-4475-b1dc-89f6015a10dc-kube-api-access-rlvg7\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.066630 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/66b06843-2245-4475-b1dc-89f6015a10dc-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.817729 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67fd97fcb5-fp48v"
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.858095 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"]
Nov 29 02:53:27 crc kubenswrapper[4749]: I1129 02:53:27.878671 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67fd97fcb5-fp48v"]
Nov 29 02:53:29 crc kubenswrapper[4749]: I1129 02:53:29.096653 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" path="/var/lib/kubelet/pods/66b06843-2245-4475-b1dc-89f6015a10dc/volumes"
Nov 29 02:53:35 crc kubenswrapper[4749]: I1129 02:53:35.914313 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused"
Nov 29 02:53:43 crc kubenswrapper[4749]: I1129 02:53:43.048276 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qqwz7"]
Nov 29 02:53:43 crc kubenswrapper[4749]: I1129 02:53:43.060523 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b359-account-create-update-zpb5q"]
Nov 29 02:53:43 crc kubenswrapper[4749]: I1129 02:53:43.071619 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b359-account-create-update-zpb5q"]
Nov 29 02:53:43 crc kubenswrapper[4749]: I1129 02:53:43.091303 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f5007a-c36c-49c8-9be8-cb3ffe01bb03" path="/var/lib/kubelet/pods/a6f5007a-c36c-49c8-9be8-cb3ffe01bb03/volumes"
Nov 29 02:53:43 crc kubenswrapper[4749]: I1129 02:53:43.092461 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qqwz7"]
Nov 29 02:53:45 crc kubenswrapper[4749]: I1129 02:53:45.096045 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fd7c23-34d8-447d-86e4-f3cb7a49f7ed" path="/var/lib/kubelet/pods/17fd7c23-34d8-447d-86e4-f3cb7a49f7ed/volumes"
Nov 29 02:53:45 crc kubenswrapper[4749]: I1129 02:53:45.914755 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-87bb96969-7cm75" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused"
Nov 29 02:53:45 crc kubenswrapper[4749]: I1129 02:53:45.915427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.354325 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.444833 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts\") pod \"76314112-321f-424a-9b69-08330318ec1f\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") "
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.444930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data\") pod \"76314112-321f-424a-9b69-08330318ec1f\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") "
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.445002 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snh7t\" (UniqueName: \"kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t\") pod \"76314112-321f-424a-9b69-08330318ec1f\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") "
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.445072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key\") pod \"76314112-321f-424a-9b69-08330318ec1f\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") "
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.445141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs\") pod \"76314112-321f-424a-9b69-08330318ec1f\" (UID: \"76314112-321f-424a-9b69-08330318ec1f\") "
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.445831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs" (OuterVolumeSpecName: "logs") pod "76314112-321f-424a-9b69-08330318ec1f" (UID: "76314112-321f-424a-9b69-08330318ec1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.446322 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76314112-321f-424a-9b69-08330318ec1f-logs\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.451542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "76314112-321f-424a-9b69-08330318ec1f" (UID: "76314112-321f-424a-9b69-08330318ec1f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.451604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t" (OuterVolumeSpecName: "kube-api-access-snh7t") pod "76314112-321f-424a-9b69-08330318ec1f" (UID: "76314112-321f-424a-9b69-08330318ec1f"). InnerVolumeSpecName "kube-api-access-snh7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.467130 4749 generic.go:334] "Generic (PLEG): container finished" podID="76314112-321f-424a-9b69-08330318ec1f" containerID="7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2" exitCode=137
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.467220 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-87bb96969-7cm75"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.467296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerDied","Data":"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"}
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.467743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-87bb96969-7cm75" event={"ID":"76314112-321f-424a-9b69-08330318ec1f","Type":"ContainerDied","Data":"a97a7a4864f2dc5b7f6fbf8b5b1b64d90042c35195e332692dffb58436359a77"}
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.467770 4749 scope.go:117] "RemoveContainer" containerID="2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.470987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data" (OuterVolumeSpecName: "config-data") pod "76314112-321f-424a-9b69-08330318ec1f" (UID: "76314112-321f-424a-9b69-08330318ec1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.471316 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts" (OuterVolumeSpecName: "scripts") pod "76314112-321f-424a-9b69-08330318ec1f" (UID: "76314112-321f-424a-9b69-08330318ec1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.547742 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.547778 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76314112-321f-424a-9b69-08330318ec1f-config-data\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.547791 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snh7t\" (UniqueName: \"kubernetes.io/projected/76314112-321f-424a-9b69-08330318ec1f-kube-api-access-snh7t\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.547802 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76314112-321f-424a-9b69-08330318ec1f-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.693496 4749 scope.go:117] "RemoveContainer" containerID="7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.715876 4749 scope.go:117] "RemoveContainer" containerID="2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"
Nov 29 02:53:50 crc kubenswrapper[4749]: E1129 02:53:50.716405 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd\": container with ID starting with 2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd not found: ID does not exist" containerID="2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.716446 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd"} err="failed to get container status \"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd\": rpc error: code = NotFound desc = could not find container \"2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd\": container with ID starting with 2cf581959172f35eed1cff609bb812671c11b3cabd46250df63ba122a1a3d1fd not found: ID does not exist"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.716466 4749 scope.go:117] "RemoveContainer" containerID="7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"
Nov 29 02:53:50 crc kubenswrapper[4749]: E1129 02:53:50.716808 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2\": container with ID starting with 7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2 not found: ID does not exist" containerID="7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.716871 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2"} err="failed to get container status \"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2\": rpc error: code = NotFound desc = could not find container \"7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2\": container with ID starting with 7b87b790cbcf527e7968a0ece2f001569fd020b781f32332ec0aa6cb276d68a2 not found: ID does not exist"
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.815431 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-87bb96969-7cm75"]
Nov 29 02:53:50 crc kubenswrapper[4749]: I1129 02:53:50.823729 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-87bb96969-7cm75"]
Nov 29 02:53:51 crc kubenswrapper[4749]: I1129 02:53:51.039562 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-js449"]
Nov 29 02:53:51 crc kubenswrapper[4749]: I1129 02:53:51.057242 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-js449"]
Nov 29 02:53:51 crc kubenswrapper[4749]: I1129 02:53:51.098086 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76314112-321f-424a-9b69-08330318ec1f" path="/var/lib/kubelet/pods/76314112-321f-424a-9b69-08330318ec1f/volumes"
Nov 29 02:53:51 crc kubenswrapper[4749]: I1129 02:53:51.100072 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98455d99-8cf1-4065-a59b-28454051102b" path="/var/lib/kubelet/pods/98455d99-8cf1-4065-a59b-28454051102b/volumes"
Nov 29 02:53:55 crc kubenswrapper[4749]: I1129 02:53:55.374510 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 02:53:55 crc kubenswrapper[4749]: I1129 02:53:55.375042 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.515062 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56d7497497-ck5ws"]
Nov 29 02:54:03 crc kubenswrapper[4749]: E1129 02:54:03.516068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516088 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: E1129 02:54:03.516110 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: E1129 02:54:03.516164 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516172 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: E1129 02:54:03.516184 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516252 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516486 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516504 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="76314112-321f-424a-9b69-08330318ec1f" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516514 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.516529 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b06843-2245-4475-b1dc-89f6015a10dc" containerName="horizon-log"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.517842 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.538750 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56d7497497-ck5ws"]
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.646455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20808d24-277c-4c49-8c37-42d5e337cb3b-horizon-secret-key\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.646498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-config-data\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.646545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20808d24-277c-4c49-8c37-42d5e337cb3b-logs\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.646577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdzff\" (UniqueName: \"kubernetes.io/projected/20808d24-277c-4c49-8c37-42d5e337cb3b-kube-api-access-hdzff\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.646616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-scripts\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.748530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20808d24-277c-4c49-8c37-42d5e337cb3b-horizon-secret-key\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.748579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-config-data\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.748626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20808d24-277c-4c49-8c37-42d5e337cb3b-logs\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.748660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdzff\" (UniqueName: \"kubernetes.io/projected/20808d24-277c-4c49-8c37-42d5e337cb3b-kube-api-access-hdzff\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.748700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-scripts\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.749142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20808d24-277c-4c49-8c37-42d5e337cb3b-logs\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.749385 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-scripts\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.749763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20808d24-277c-4c49-8c37-42d5e337cb3b-config-data\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.765841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdzff\" (UniqueName: \"kubernetes.io/projected/20808d24-277c-4c49-8c37-42d5e337cb3b-kube-api-access-hdzff\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29 02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.768002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20808d24-277c-4c49-8c37-42d5e337cb3b-horizon-secret-key\") pod \"horizon-56d7497497-ck5ws\" (UID: \"20808d24-277c-4c49-8c37-42d5e337cb3b\") " pod="openstack/horizon-56d7497497-ck5ws"
Nov 29
02:54:03 crc kubenswrapper[4749]: I1129 02:54:03.841518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56d7497497-ck5ws" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.311411 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56d7497497-ck5ws"] Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.655219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56d7497497-ck5ws" event={"ID":"20808d24-277c-4c49-8c37-42d5e337cb3b","Type":"ContainerStarted","Data":"0318b98f9ffde10d14f7afe8aaec8777c7205902b2fa931f6a51dc1eef196494"} Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.655505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56d7497497-ck5ws" event={"ID":"20808d24-277c-4c49-8c37-42d5e337cb3b","Type":"ContainerStarted","Data":"ff8725841793fe03dab7205a2fb68af721baa376e847d79264ed9724776885d0"} Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.748320 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-cxgwg"] Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.750070 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.761402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-cxgwg"] Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.842374 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-5b01-account-create-update-ljxkt"] Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.844143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.847909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.853652 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5b01-account-create-update-ljxkt"] Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.876435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhmpm\" (UniqueName: \"kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.876753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.978678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhmpm\" (UniqueName: \"kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.979046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsrq8\" (UniqueName: 
\"kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.979166 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.979419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.979877 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:04 crc kubenswrapper[4749]: I1129 02:54:04.997759 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhmpm\" (UniqueName: \"kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm\") pod \"heat-db-create-cxgwg\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.080440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.080529 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsrq8\" (UniqueName: \"kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.081619 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.108747 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsrq8\" (UniqueName: \"kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8\") pod \"heat-5b01-account-create-update-ljxkt\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.117970 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.158386 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.595917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-cxgwg"] Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.663236 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5b01-account-create-update-ljxkt"] Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.666975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56d7497497-ck5ws" event={"ID":"20808d24-277c-4c49-8c37-42d5e337cb3b","Type":"ContainerStarted","Data":"7d1a2c431efeb33c8721aa1d222d8c3ad007f9dd824b9b71d3d8f452e75e0565"} Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.668908 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cxgwg" event={"ID":"476687b6-9c04-4736-ad82-900e844ea6be","Type":"ContainerStarted","Data":"e3f2a8506ab2289f0c2e6f186044bfbfb49be419d239e81ccab40e5f79371e78"} Nov 29 02:54:05 crc kubenswrapper[4749]: W1129 02:54:05.690961 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff1338c_165a_47af_ad4e_275d7b90dd87.slice/crio-e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419 WatchSource:0}: Error finding container e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419: Status 404 returned error can't find the container with id e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419 Nov 29 02:54:05 crc kubenswrapper[4749]: I1129 02:54:05.692971 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56d7497497-ck5ws" podStartSLOduration=2.692954446 podStartE2EDuration="2.692954446s" podCreationTimestamp="2025-11-29 02:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:54:05.6906371 +0000 UTC m=+6188.862786977" watchObservedRunningTime="2025-11-29 02:54:05.692954446 +0000 UTC m=+6188.865104303" Nov 29 02:54:06 crc kubenswrapper[4749]: I1129 02:54:06.682990 4749 generic.go:334] "Generic (PLEG): container finished" podID="476687b6-9c04-4736-ad82-900e844ea6be" containerID="b11465d1097dbefe618db5f6c2af03a9a67ed514c1c471dc3adeb7aedd44714a" exitCode=0 Nov 29 02:54:06 crc kubenswrapper[4749]: I1129 02:54:06.683445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cxgwg" event={"ID":"476687b6-9c04-4736-ad82-900e844ea6be","Type":"ContainerDied","Data":"b11465d1097dbefe618db5f6c2af03a9a67ed514c1c471dc3adeb7aedd44714a"} Nov 29 02:54:06 crc kubenswrapper[4749]: I1129 02:54:06.687505 4749 generic.go:334] "Generic (PLEG): container finished" podID="3ff1338c-165a-47af-ad4e-275d7b90dd87" containerID="486e18a6d2bae3cbb3c65ff282a069114dde06569453785301ea7f164ce9c773" exitCode=0 Nov 29 02:54:06 crc kubenswrapper[4749]: I1129 02:54:06.687628 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5b01-account-create-update-ljxkt" event={"ID":"3ff1338c-165a-47af-ad4e-275d7b90dd87","Type":"ContainerDied","Data":"486e18a6d2bae3cbb3c65ff282a069114dde06569453785301ea7f164ce9c773"} Nov 29 02:54:06 crc kubenswrapper[4749]: I1129 02:54:06.687679 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/heat-5b01-account-create-update-ljxkt" event={"ID":"3ff1338c-165a-47af-ad4e-275d7b90dd87","Type":"ContainerStarted","Data":"e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419"} Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.312636 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.318334 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.462008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsrq8\" (UniqueName: \"kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8\") pod \"3ff1338c-165a-47af-ad4e-275d7b90dd87\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.462088 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts\") pod \"3ff1338c-165a-47af-ad4e-275d7b90dd87\" (UID: \"3ff1338c-165a-47af-ad4e-275d7b90dd87\") " Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.462150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts\") pod \"476687b6-9c04-4736-ad82-900e844ea6be\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.462230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhmpm\" (UniqueName: \"kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm\") pod \"476687b6-9c04-4736-ad82-900e844ea6be\" (UID: \"476687b6-9c04-4736-ad82-900e844ea6be\") " Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.463135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ff1338c-165a-47af-ad4e-275d7b90dd87" (UID: "3ff1338c-165a-47af-ad4e-275d7b90dd87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.463186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "476687b6-9c04-4736-ad82-900e844ea6be" (UID: "476687b6-9c04-4736-ad82-900e844ea6be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.468468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm" (OuterVolumeSpecName: "kube-api-access-lhmpm") pod "476687b6-9c04-4736-ad82-900e844ea6be" (UID: "476687b6-9c04-4736-ad82-900e844ea6be"). InnerVolumeSpecName "kube-api-access-lhmpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.482026 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8" (OuterVolumeSpecName: "kube-api-access-lsrq8") pod "3ff1338c-165a-47af-ad4e-275d7b90dd87" (UID: "3ff1338c-165a-47af-ad4e-275d7b90dd87"). InnerVolumeSpecName "kube-api-access-lsrq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.564996 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/476687b6-9c04-4736-ad82-900e844ea6be-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.565049 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhmpm\" (UniqueName: \"kubernetes.io/projected/476687b6-9c04-4736-ad82-900e844ea6be-kube-api-access-lhmpm\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.565072 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsrq8\" (UniqueName: \"kubernetes.io/projected/3ff1338c-165a-47af-ad4e-275d7b90dd87-kube-api-access-lsrq8\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.565089 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ff1338c-165a-47af-ad4e-275d7b90dd87-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.715505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-cxgwg" event={"ID":"476687b6-9c04-4736-ad82-900e844ea6be","Type":"ContainerDied","Data":"e3f2a8506ab2289f0c2e6f186044bfbfb49be419d239e81ccab40e5f79371e78"} Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.715596 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f2a8506ab2289f0c2e6f186044bfbfb49be419d239e81ccab40e5f79371e78" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.715540 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-cxgwg" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.718340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5b01-account-create-update-ljxkt" event={"ID":"3ff1338c-165a-47af-ad4e-275d7b90dd87","Type":"ContainerDied","Data":"e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419"} Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.718393 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5dbb007e0f7b30ed7bb9f3ede34a42610488d5e64d395eb5f4de080b8fc4419" Nov 29 02:54:08 crc kubenswrapper[4749]: I1129 02:54:08.718430 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5b01-account-create-update-ljxkt" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.053469 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-lvfcj"] Nov 29 02:54:10 crc kubenswrapper[4749]: E1129 02:54:10.055084 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476687b6-9c04-4736-ad82-900e844ea6be" containerName="mariadb-database-create" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.055223 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="476687b6-9c04-4736-ad82-900e844ea6be" containerName="mariadb-database-create" Nov 29 02:54:10 crc kubenswrapper[4749]: E1129 02:54:10.055376 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff1338c-165a-47af-ad4e-275d7b90dd87" containerName="mariadb-account-create-update" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.055457 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff1338c-165a-47af-ad4e-275d7b90dd87" containerName="mariadb-account-create-update" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.055784 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff1338c-165a-47af-ad4e-275d7b90dd87" containerName="mariadb-account-create-update" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.055914 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="476687b6-9c04-4736-ad82-900e844ea6be" containerName="mariadb-database-create" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.057041 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.062557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.063661 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q4sh9" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.073109 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lvfcj"] Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.213954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.214255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lvxz\" (UniqueName: \"kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.215369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.317916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.318108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lvxz\" (UniqueName: \"kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.318261 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.323775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.335010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.352007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lvxz\" (UniqueName: \"kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz\") pod \"heat-db-sync-lvfcj\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.382575 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:10 crc kubenswrapper[4749]: I1129 02:54:10.873117 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-lvfcj"] Nov 29 02:54:11 crc kubenswrapper[4749]: I1129 02:54:11.746370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lvfcj" event={"ID":"304202b2-d372-4ab5-95f7-b77b94748b1a","Type":"ContainerStarted","Data":"f1d0d094338b938dc8560565b8824360db49d8f3e97551b595855ef3cb2bcdc2"} Nov 29 02:54:13 crc kubenswrapper[4749]: I1129 02:54:13.842251 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56d7497497-ck5ws" Nov 29 02:54:13 crc kubenswrapper[4749]: I1129 02:54:13.842988 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56d7497497-ck5ws" Nov 29 02:54:16 crc kubenswrapper[4749]: I1129 02:54:16.420135 4749 scope.go:117] "RemoveContainer" containerID="b63741453603aecbda87cee7cfd9bcf6743ef5a280be203156ebb303cba77243" Nov 29 02:54:19 crc kubenswrapper[4749]: I1129 02:54:19.964475 4749 scope.go:117] "RemoveContainer" containerID="a2021300eb0e5267251914a7b74251631fdb371ca3421dfa0860ed477adeacc6" Nov 29 02:54:20 crc kubenswrapper[4749]: I1129 02:54:20.013456 4749 scope.go:117] "RemoveContainer" containerID="fb1f666098c5292e037c4dd5ffd719561c294f91fdb39f539dc5bfbe2b663775" Nov 29 02:54:20 crc kubenswrapper[4749]: I1129 02:54:20.075918 4749 scope.go:117] "RemoveContainer" containerID="a4631435c9aeaacf2ca5b3f8179670453fd4edc6a8f1d7e9d4d36ef8d8808c7b" Nov 29 02:54:20 crc kubenswrapper[4749]: I1129 02:54:20.870491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lvfcj" event={"ID":"304202b2-d372-4ab5-95f7-b77b94748b1a","Type":"ContainerStarted","Data":"1a88eedfacc04edc798be76f032f37259b34203383e686ac112a0ca43b037bbe"} Nov 29 02:54:22 crc kubenswrapper[4749]: I1129 02:54:22.895643 4749 generic.go:334] "Generic (PLEG): container finished" podID="304202b2-d372-4ab5-95f7-b77b94748b1a" containerID="1a88eedfacc04edc798be76f032f37259b34203383e686ac112a0ca43b037bbe" exitCode=0 Nov 29 02:54:22 crc kubenswrapper[4749]: I1129 02:54:22.895824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lvfcj" event={"ID":"304202b2-d372-4ab5-95f7-b77b94748b1a","Type":"ContainerDied","Data":"1a88eedfacc04edc798be76f032f37259b34203383e686ac112a0ca43b037bbe"} Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.284712 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.397706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data\") pod \"304202b2-d372-4ab5-95f7-b77b94748b1a\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.398311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lvxz\" (UniqueName: \"kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz\") pod \"304202b2-d372-4ab5-95f7-b77b94748b1a\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.398422 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle\") pod \"304202b2-d372-4ab5-95f7-b77b94748b1a\" (UID: \"304202b2-d372-4ab5-95f7-b77b94748b1a\") " Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.411438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz" (OuterVolumeSpecName: "kube-api-access-5lvxz") pod "304202b2-d372-4ab5-95f7-b77b94748b1a" (UID: "304202b2-d372-4ab5-95f7-b77b94748b1a"). InnerVolumeSpecName "kube-api-access-5lvxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.443223 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "304202b2-d372-4ab5-95f7-b77b94748b1a" (UID: "304202b2-d372-4ab5-95f7-b77b94748b1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.486381 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data" (OuterVolumeSpecName: "config-data") pod "304202b2-d372-4ab5-95f7-b77b94748b1a" (UID: "304202b2-d372-4ab5-95f7-b77b94748b1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.500842 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.500866 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lvxz\" (UniqueName: \"kubernetes.io/projected/304202b2-d372-4ab5-95f7-b77b94748b1a-kube-api-access-5lvxz\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.500876 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/304202b2-d372-4ab5-95f7-b77b94748b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.920777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-lvfcj" event={"ID":"304202b2-d372-4ab5-95f7-b77b94748b1a","Type":"ContainerDied","Data":"f1d0d094338b938dc8560565b8824360db49d8f3e97551b595855ef3cb2bcdc2"} Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.920812 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d0d094338b938dc8560565b8824360db49d8f3e97551b595855ef3cb2bcdc2" Nov 29 02:54:24 crc kubenswrapper[4749]: I1129 02:54:24.920876 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-lvfcj" Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.373926 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.374168 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.374218 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.374897 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.374937 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311" gracePeriod=600 Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.676890 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56d7497497-ck5ws" Nov 29 02:54:25 crc 
kubenswrapper[4749]: I1129 02:54:25.937092 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311" exitCode=0 Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.937179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311"} Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.937427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48"} Nov 29 02:54:25 crc kubenswrapper[4749]: I1129 02:54:25.937456 4749 scope.go:117] "RemoveContainer" containerID="8864f975d781e026a0add9aee7d983121f8ef021fbc7718b34e2ebd77d0e4dbd" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.102070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5c7858b669-qhgk5"] Nov 29 02:54:26 crc kubenswrapper[4749]: E1129 02:54:26.102987 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304202b2-d372-4ab5-95f7-b77b94748b1a" containerName="heat-db-sync" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.103016 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="304202b2-d372-4ab5-95f7-b77b94748b1a" containerName="heat-db-sync" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.103263 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="304202b2-d372-4ab5-95f7-b77b94748b1a" containerName="heat-db-sync" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.104380 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.107121 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.107256 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-q4sh9" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.110561 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.121334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5c7858b669-qhgk5"] Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.237846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6xl\" (UniqueName: \"kubernetes.io/projected/29f760d9-0335-46d5-b098-1df1cf5067e0-kube-api-access-fz6xl\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.238296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data-custom\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.238430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-combined-ca-bundle\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.238655 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.240089 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-74ddcf9444-kzl5n"] Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.241505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.244755 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.257629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74ddcf9444-kzl5n"] Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.341765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data-custom\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342397 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-combined-ca-bundle\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-combined-ca-bundle\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6xl\" (UniqueName: \"kubernetes.io/projected/29f760d9-0335-46d5-b098-1df1cf5067e0-kube-api-access-fz6xl\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data-custom\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.342768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trjx2\" (UniqueName: \"kubernetes.io/projected/79248744-29f0-43bd-b44a-a4b8c42aae39-kube-api-access-trjx2\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " 
pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.351207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-combined-ca-bundle\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.359782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data-custom\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.364977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6xl\" (UniqueName: \"kubernetes.io/projected/29f760d9-0335-46d5-b098-1df1cf5067e0-kube-api-access-fz6xl\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.388337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29f760d9-0335-46d5-b098-1df1cf5067e0-config-data\") pod \"heat-engine-5c7858b669-qhgk5\" (UID: \"29f760d9-0335-46d5-b098-1df1cf5067e0\") " pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.426404 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-588c68cbfd-bccbx"] Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.428058 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.431688 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.432232 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.444607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trjx2\" (UniqueName: \"kubernetes.io/projected/79248744-29f0-43bd-b44a-a4b8c42aae39-kube-api-access-trjx2\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.444673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data-custom\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.444727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-combined-ca-bundle\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.444772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.461660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data-custom\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.461773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-config-data\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.465805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trjx2\" (UniqueName: \"kubernetes.io/projected/79248744-29f0-43bd-b44a-a4b8c42aae39-kube-api-access-trjx2\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.466366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-588c68cbfd-bccbx"] Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.469832 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79248744-29f0-43bd-b44a-a4b8c42aae39-combined-ca-bundle\") pod \"heat-api-74ddcf9444-kzl5n\" (UID: \"79248744-29f0-43bd-b44a-a4b8c42aae39\") " pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.549636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-combined-ca-bundle\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.549918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.549947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27slt\" (UniqueName: \"kubernetes.io/projected/1fda202d-bbc6-494b-89ff-e49cff899f83-kube-api-access-27slt\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.549973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data-custom\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.562251 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.652249 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.652515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27slt\" (UniqueName: \"kubernetes.io/projected/1fda202d-bbc6-494b-89ff-e49cff899f83-kube-api-access-27slt\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.652541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data-custom\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.652653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-combined-ca-bundle\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.657838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-combined-ca-bundle\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " 
pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.658268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.658485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1fda202d-bbc6-494b-89ff-e49cff899f83-config-data-custom\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.683884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27slt\" (UniqueName: \"kubernetes.io/projected/1fda202d-bbc6-494b-89ff-e49cff899f83-kube-api-access-27slt\") pod \"heat-cfnapi-588c68cbfd-bccbx\" (UID: \"1fda202d-bbc6-494b-89ff-e49cff899f83\") " pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:26 crc kubenswrapper[4749]: I1129 02:54:26.837210 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.010635 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5c7858b669-qhgk5"] Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.151664 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-74ddcf9444-kzl5n"] Nov 29 02:54:27 crc kubenswrapper[4749]: W1129 02:54:27.317607 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fda202d_bbc6_494b_89ff_e49cff899f83.slice/crio-6b6ec7b22d2cee0e4803f4799a86b7f30c462e5294ee304dcde86a3876f5ee05 WatchSource:0}: Error finding container 6b6ec7b22d2cee0e4803f4799a86b7f30c462e5294ee304dcde86a3876f5ee05: Status 404 returned error can't find the container with id 6b6ec7b22d2cee0e4803f4799a86b7f30c462e5294ee304dcde86a3876f5ee05 Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.320000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-588c68cbfd-bccbx"] Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.945796 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56d7497497-ck5ws" Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.978870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5c7858b669-qhgk5" event={"ID":"29f760d9-0335-46d5-b098-1df1cf5067e0","Type":"ContainerStarted","Data":"a747a044c128f052ae45b6241df0f8157687dc73bb478165a95392760cbd8c9a"} Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.978917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5c7858b669-qhgk5" event={"ID":"29f760d9-0335-46d5-b098-1df1cf5067e0","Type":"ContainerStarted","Data":"ac13fb1e436aa897227c2208db18c9e485f7f7843635cbc2df0a4d8f182f96d4"} Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.979013 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.980084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-74ddcf9444-kzl5n" event={"ID":"79248744-29f0-43bd-b44a-a4b8c42aae39","Type":"ContainerStarted","Data":"d6a20af494929685690f49a97b300798808465d402e2d747b5895b1da0b35ff0"} Nov 29 02:54:27 crc kubenswrapper[4749]: I1129 02:54:27.981361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" event={"ID":"1fda202d-bbc6-494b-89ff-e49cff899f83","Type":"ContainerStarted","Data":"6b6ec7b22d2cee0e4803f4799a86b7f30c462e5294ee304dcde86a3876f5ee05"} Nov 29 02:54:28 crc kubenswrapper[4749]: I1129 02:54:28.003255 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:54:28 crc kubenswrapper[4749]: I1129 02:54:28.003507 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon-log" containerID="cri-o://6a42efdc71f1f8074001d9d55efefb62cbc0d003658c233d1b05bec482b538a4" gracePeriod=30 Nov 29 02:54:28 crc kubenswrapper[4749]: I1129 02:54:28.003641 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" containerID="cri-o://3b972c0fc3f91c7775700cc247de4f2b7038f7a8f4c8d60bf70b4b5a61b20e21" gracePeriod=30 Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.009872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-74ddcf9444-kzl5n" event={"ID":"79248744-29f0-43bd-b44a-a4b8c42aae39","Type":"ContainerStarted","Data":"32af68cf0bdd85943879b6f9ca74815e605984d7fd333edb90ab99a0e8093ebe"} Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.011359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.015455 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" event={"ID":"1fda202d-bbc6-494b-89ff-e49cff899f83","Type":"ContainerStarted","Data":"68a343cba120f5c71b626b590dd2a9d73ac4b6bc170de2c63bdb0de0e103568a"} Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.015618 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.037614 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-74ddcf9444-kzl5n" podStartSLOduration=1.9591837970000001 podStartE2EDuration="5.037596914s" podCreationTimestamp="2025-11-29 02:54:26 +0000 UTC" firstStartedPulling="2025-11-29 02:54:27.141325098 +0000 UTC m=+6210.313474955" lastFinishedPulling="2025-11-29 02:54:30.219738215 +0000 UTC m=+6213.391888072" observedRunningTime="2025-11-29 02:54:31.03290358 +0000 UTC m=+6214.205053437" watchObservedRunningTime="2025-11-29 02:54:31.037596914 +0000 UTC m=+6214.209746761" Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.040536 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5c7858b669-qhgk5" podStartSLOduration=5.040523975 podStartE2EDuration="5.040523975s" podCreationTimestamp="2025-11-29 02:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:54:28.016954338 +0000 UTC m=+6211.189104195" watchObservedRunningTime="2025-11-29 02:54:31.040523975 +0000 UTC m=+6214.212673832" 
Nov 29 02:54:31 crc kubenswrapper[4749]: I1129 02:54:31.055328 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" podStartSLOduration=2.153519172 podStartE2EDuration="5.055312163s" podCreationTimestamp="2025-11-29 02:54:26 +0000 UTC" firstStartedPulling="2025-11-29 02:54:27.3206471 +0000 UTC m=+6210.492796957" lastFinishedPulling="2025-11-29 02:54:30.222440091 +0000 UTC m=+6213.394589948" observedRunningTime="2025-11-29 02:54:31.051500201 +0000 UTC m=+6214.223650068" watchObservedRunningTime="2025-11-29 02:54:31.055312163 +0000 UTC m=+6214.227462020" Nov 29 02:54:32 crc kubenswrapper[4749]: I1129 02:54:32.027907 4749 generic.go:334] "Generic (PLEG): container finished" podID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerID="3b972c0fc3f91c7775700cc247de4f2b7038f7a8f4c8d60bf70b4b5a61b20e21" exitCode=0 Nov 29 02:54:32 crc kubenswrapper[4749]: I1129 02:54:32.027984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerDied","Data":"3b972c0fc3f91c7775700cc247de4f2b7038f7a8f4c8d60bf70b4b5a61b20e21"} Nov 29 02:54:34 crc kubenswrapper[4749]: I1129 02:54:34.057441 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2eba-account-create-update-jxl2n"] Nov 29 02:54:34 crc kubenswrapper[4749]: I1129 02:54:34.079027 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2eba-account-create-update-jxl2n"] Nov 29 02:54:34 crc kubenswrapper[4749]: I1129 02:54:34.088067 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pjk59"] Nov 29 02:54:34 crc kubenswrapper[4749]: I1129 02:54:34.098021 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pjk59"] Nov 29 02:54:35 crc kubenswrapper[4749]: I1129 02:54:35.087329 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794dc0b7-98bd-4b0b-8c79-19523f9057b9" path="/var/lib/kubelet/pods/794dc0b7-98bd-4b0b-8c79-19523f9057b9/volumes" Nov 29 02:54:35 crc kubenswrapper[4749]: I1129 02:54:35.088493 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7992f3f8-703c-4b6b-8ef1-4067d9972fc6" path="/var/lib/kubelet/pods/7992f3f8-703c-4b6b-8ef1-4067d9972fc6/volumes" Nov 29 02:54:36 crc kubenswrapper[4749]: I1129 02:54:36.621874 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 29 02:54:37 crc kubenswrapper[4749]: I1129 02:54:37.802283 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-74ddcf9444-kzl5n" Nov 29 02:54:38 crc kubenswrapper[4749]: I1129 02:54:38.210443 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-588c68cbfd-bccbx" Nov 29 02:54:42 crc kubenswrapper[4749]: I1129 02:54:42.061410 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2mvvp"] Nov 29 02:54:42 crc kubenswrapper[4749]: I1129 02:54:42.073122 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2mvvp"] Nov 29 02:54:43 crc kubenswrapper[4749]: I1129 02:54:43.085497 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38" path="/var/lib/kubelet/pods/fd7ecbdf-d422-43fd-ab04-9b32dbbb4b38/volumes" Nov 29 02:54:46 crc kubenswrapper[4749]: I1129 02:54:46.471964 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5c7858b669-qhgk5" Nov 29 02:54:46 crc kubenswrapper[4749]: I1129 02:54:46.621060 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 29 02:54:56 crc kubenswrapper[4749]: I1129 02:54:56.620991 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7849bb687c-m7rzj" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Nov 29 02:54:56 crc kubenswrapper[4749]: I1129 02:54:56.622035 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.733245 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"] Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.736101 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.738932 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.747935 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"] Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.848871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvsck\" (UniqueName: \"kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.848976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.849163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:54:57 crc 
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.952095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvsck\" (UniqueName: \"kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.952228 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.952357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.952961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.953183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Nov 29 02:54:57 crc kubenswrapper[4749]: I1129 02:54:57.979030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvsck\" (UniqueName: \"kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.331089 4749 generic.go:334] "Generic (PLEG): container finished" podID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerID="6a42efdc71f1f8074001d9d55efefb62cbc0d003658c233d1b05bec482b538a4" exitCode=137 Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.331163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerDied","Data":"6a42efdc71f1f8074001d9d55efefb62cbc0d003658c233d1b05bec482b538a4"} Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.494104 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.563478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs\") pod \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.563617 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t756l\" (UniqueName: \"kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l\") pod \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.563650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key\") pod \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.563714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data\") pod \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.563863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs" (OuterVolumeSpecName: "logs") pod "57cb1c71-97d0-4742-be6b-4ce763c9b51d" (UID: "57cb1c71-97d0-4742-be6b-4ce763c9b51d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.564379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts\") pod \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\" (UID: \"57cb1c71-97d0-4742-be6b-4ce763c9b51d\") " Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.565315 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57cb1c71-97d0-4742-be6b-4ce763c9b51d-logs\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.568899 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "57cb1c71-97d0-4742-be6b-4ce763c9b51d" (UID: "57cb1c71-97d0-4742-be6b-4ce763c9b51d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.570502 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l" (OuterVolumeSpecName: "kube-api-access-t756l") pod "57cb1c71-97d0-4742-be6b-4ce763c9b51d" (UID: "57cb1c71-97d0-4742-be6b-4ce763c9b51d"). InnerVolumeSpecName "kube-api-access-t756l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.601987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts" (OuterVolumeSpecName: "scripts") pod "57cb1c71-97d0-4742-be6b-4ce763c9b51d" (UID: "57cb1c71-97d0-4742-be6b-4ce763c9b51d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.608708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data" (OuterVolumeSpecName: "config-data") pod "57cb1c71-97d0-4742-be6b-4ce763c9b51d" (UID: "57cb1c71-97d0-4742-be6b-4ce763c9b51d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.667225 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t756l\" (UniqueName: \"kubernetes.io/projected/57cb1c71-97d0-4742-be6b-4ce763c9b51d-kube-api-access-t756l\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.667245 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/57cb1c71-97d0-4742-be6b-4ce763c9b51d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.667254 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.667263 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57cb1c71-97d0-4742-be6b-4ce763c9b51d-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:54:58 crc kubenswrapper[4749]: I1129 02:54:58.985733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5"] Nov 29 02:54:58 crc kubenswrapper[4749]: W1129 02:54:58.991754 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b635b5_bb08_4a95_b27e_56d1ea63ffc3.slice/crio-901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785 WatchSource:0}: Error finding container 901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785: Status 404 returned error can't find the container with id 901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785 Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.342389 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7849bb687c-m7rzj" Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.342403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7849bb687c-m7rzj" event={"ID":"57cb1c71-97d0-4742-be6b-4ce763c9b51d","Type":"ContainerDied","Data":"3f60ac14f4a73af351a1acc42ab7985c4899a60598dbce58afe155de18b42469"} Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.342782 4749 scope.go:117] "RemoveContainer" containerID="3b972c0fc3f91c7775700cc247de4f2b7038f7a8f4c8d60bf70b4b5a61b20e21" Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.344979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerStarted","Data":"75a57e0c69303d1b099db751586ef4ea5a6a794bd2ce9b6fdb9c491fd4d5577d"} Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.345013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerStarted","Data":"901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785"} Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.375082 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.383853 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7849bb687c-m7rzj"] Nov 29 02:54:59 crc kubenswrapper[4749]: I1129 02:54:59.617334 4749 scope.go:117] "RemoveContainer" containerID="6a42efdc71f1f8074001d9d55efefb62cbc0d003658c233d1b05bec482b538a4" Nov 29 02:55:00 crc kubenswrapper[4749]: I1129 02:55:00.363607 4749 generic.go:334] "Generic (PLEG): container finished" podID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerID="75a57e0c69303d1b099db751586ef4ea5a6a794bd2ce9b6fdb9c491fd4d5577d" exitCode=0 Nov 29 02:55:00 crc kubenswrapper[4749]: I1129 02:55:00.363668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerDied","Data":"75a57e0c69303d1b099db751586ef4ea5a6a794bd2ce9b6fdb9c491fd4d5577d"} Nov 29 02:55:01 crc kubenswrapper[4749]: I1129 02:55:01.110711 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" path="/var/lib/kubelet/pods/57cb1c71-97d0-4742-be6b-4ce763c9b51d/volumes" Nov 29 02:55:03 crc kubenswrapper[4749]: I1129 02:55:03.410699 4749 generic.go:334] "Generic (PLEG): container finished" podID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerID="937f8e21a327788f4e5451a3065d9593434b65f737b84353e7e480289d1f68e4" exitCode=0 Nov 29 02:55:03 crc kubenswrapper[4749]: I1129 02:55:03.410765 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerDied","Data":"937f8e21a327788f4e5451a3065d9593434b65f737b84353e7e480289d1f68e4"} Nov 29 02:55:04 crc kubenswrapper[4749]: I1129 02:55:04.426464 4749 generic.go:334] "Generic (PLEG): container finished" podID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerID="e8ee6a16253a023edf02cebbffa306bb6293c03780a245634948329968735466" exitCode=0 Nov 29 02:55:04 crc kubenswrapper[4749]: 
I1129 02:55:04.426542 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerDied","Data":"e8ee6a16253a023edf02cebbffa306bb6293c03780a245634948329968735466"} Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.919253 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.946086 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvsck\" (UniqueName: \"kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck\") pod \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.947693 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle\") pod \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.947871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util\") pod \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\" (UID: \"41b635b5-bb08-4a95-b27e-56d1ea63ffc3\") " Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.950041 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle" (OuterVolumeSpecName: "bundle") pod "41b635b5-bb08-4a95-b27e-56d1ea63ffc3" (UID: "41b635b5-bb08-4a95-b27e-56d1ea63ffc3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.958231 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util" (OuterVolumeSpecName: "util") pod "41b635b5-bb08-4a95-b27e-56d1ea63ffc3" (UID: "41b635b5-bb08-4a95-b27e-56d1ea63ffc3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:55:05 crc kubenswrapper[4749]: I1129 02:55:05.961761 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck" (OuterVolumeSpecName: "kube-api-access-gvsck") pod "41b635b5-bb08-4a95-b27e-56d1ea63ffc3" (UID: "41b635b5-bb08-4a95-b27e-56d1ea63ffc3"). InnerVolumeSpecName "kube-api-access-gvsck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.053755 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvsck\" (UniqueName: \"kubernetes.io/projected/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-kube-api-access-gvsck\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.054073 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.054092 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/41b635b5-bb08-4a95-b27e-56d1ea63ffc3-util\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.476333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" event={"ID":"41b635b5-bb08-4a95-b27e-56d1ea63ffc3","Type":"ContainerDied","Data":"901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785"} Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.476381 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901a38a77bcb3c1240ce70ea87d4d28f4c040224ec1b41b502def468c63c3785" Nov 29 02:55:06 crc kubenswrapper[4749]: I1129 02:55:06.476446 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5" Nov 29 02:55:14 crc kubenswrapper[4749]: I1129 02:55:14.097436 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-148b-account-create-update-xc8b2"] Nov 29 02:55:14 crc kubenswrapper[4749]: I1129 02:55:14.113625 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-j9x6q"] Nov 29 02:55:14 crc kubenswrapper[4749]: I1129 02:55:14.149161 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-148b-account-create-update-xc8b2"] Nov 29 02:55:14 crc kubenswrapper[4749]: I1129 02:55:14.169523 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-j9x6q"] Nov 29 02:55:15 crc kubenswrapper[4749]: I1129 02:55:15.085325 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e03156c-645e-4f64-8201-3343cc04f9d3" path="/var/lib/kubelet/pods/4e03156c-645e-4f64-8201-3343cc04f9d3/volumes" Nov 29 02:55:15 crc kubenswrapper[4749]: I1129 02:55:15.086579 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1f9731-f71b-428d-bc77-dc3b56eeb7c5" path="/var/lib/kubelet/pods/7b1f9731-f71b-428d-bc77-dc3b56eeb7c5/volumes" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382522 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp"] Nov 29 02:55:16 crc kubenswrapper[4749]: E1129 02:55:16.382890 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="pull" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382903 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="pull" Nov 29 02:55:16 crc kubenswrapper[4749]: E1129 02:55:16.382933 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon-log" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382938 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon-log" Nov 29 02:55:16 crc kubenswrapper[4749]: E1129 02:55:16.382950 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="util" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382955 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="util" Nov 29 02:55:16 crc kubenswrapper[4749]: E1129 02:55:16.382967 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382973 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" Nov 29 02:55:16 crc kubenswrapper[4749]: E1129 02:55:16.382980 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="extract" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.382986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="extract" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.383148 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.383168 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b635b5-bb08-4a95-b27e-56d1ea63ffc3" containerName="extract" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.383178 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57cb1c71-97d0-4742-be6b-4ce763c9b51d" containerName="horizon-log" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.383806 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.388129 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.388512 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-cbrjm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.388800 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.399772 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgdfp\" (UniqueName: \"kubernetes.io/projected/3605df88-08f3-4d94-bed9-34339855602a-kube-api-access-xgdfp\") pod \"obo-prometheus-operator-668cf9dfbb-2hprp\" (UID: \"3605df88-08f3-4d94-bed9-34339855602a\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.400365 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.503416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgdfp\" (UniqueName: \"kubernetes.io/projected/3605df88-08f3-4d94-bed9-34339855602a-kube-api-access-xgdfp\") pod \"obo-prometheus-operator-668cf9dfbb-2hprp\" (UID: \"3605df88-08f3-4d94-bed9-34339855602a\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.545274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgdfp\" (UniqueName: \"kubernetes.io/projected/3605df88-08f3-4d94-bed9-34339855602a-kube-api-access-xgdfp\") pod \"obo-prometheus-operator-668cf9dfbb-2hprp\" (UID: \"3605df88-08f3-4d94-bed9-34339855602a\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.556868 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.559692 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.563505 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.563762 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-cxttr" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.564271 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.565543 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.578164 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.606333 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.607700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.608317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" (UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.608378 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.608447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" (UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.707699 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.709993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.710077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" (UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.710097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.710147 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" (UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.713856 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.713874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f261a062-257e-44a5-8f06-05d682c51638-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh\" (UID: \"f261a062-257e-44a5-8f06-05d682c51638\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.713903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" (UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.719244 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9aae6a67-cff1-4a25-b1b9-eecf82756432-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f\" 
(UID: \"9aae6a67-cff1-4a25-b1b9-eecf82756432\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.761256 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-xchfm"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.762643 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.764772 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-jd9dq" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.768702 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.786987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-xchfm"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.814093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpnk\" (UniqueName: \"kubernetes.io/projected/9f2b11e6-22ff-4b26-a9d3-52241843dde7-kube-api-access-kdpnk\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.814182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f2b11e6-22ff-4b26-a9d3-52241843dde7-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.873078 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rqm5f"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.874291 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.877733 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-72l69" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.889449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rqm5f"] Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.925490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc37afd3-a5b8-42ba-9749-dd7435506f70-openshift-service-ca\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.926069 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvcl\" (UniqueName: \"kubernetes.io/projected/bc37afd3-a5b8-42ba-9749-dd7435506f70-kube-api-access-2wvcl\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.926210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpnk\" (UniqueName: \"kubernetes.io/projected/9f2b11e6-22ff-4b26-a9d3-52241843dde7-kube-api-access-kdpnk\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.926381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f2b11e6-22ff-4b26-a9d3-52241843dde7-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.927612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.941779 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.960116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f2b11e6-22ff-4b26-a9d3-52241843dde7-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:16 crc kubenswrapper[4749]: I1129 02:55:16.972899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpnk\" (UniqueName: \"kubernetes.io/projected/9f2b11e6-22ff-4b26-a9d3-52241843dde7-kube-api-access-kdpnk\") pod \"observability-operator-d8bb48f5d-xchfm\" (UID: \"9f2b11e6-22ff-4b26-a9d3-52241843dde7\") " pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.030724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc37afd3-a5b8-42ba-9749-dd7435506f70-openshift-service-ca\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.030816 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvcl\" (UniqueName: \"kubernetes.io/projected/bc37afd3-a5b8-42ba-9749-dd7435506f70-kube-api-access-2wvcl\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.042278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bc37afd3-a5b8-42ba-9749-dd7435506f70-openshift-service-ca\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.051891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvcl\" (UniqueName: \"kubernetes.io/projected/bc37afd3-a5b8-42ba-9749-dd7435506f70-kube-api-access-2wvcl\") pod \"perses-operator-5446b9c989-rqm5f\" (UID: \"bc37afd3-a5b8-42ba-9749-dd7435506f70\") " pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.150176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.189592 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.368973 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp"] Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.582217 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh"] Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.634592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" event={"ID":"3605df88-08f3-4d94-bed9-34339855602a","Type":"ContainerStarted","Data":"c7973ad175239ed4cca3991c847022e7fbcd432c08e552b95f9ae66c8601d770"} Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.638569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" event={"ID":"f261a062-257e-44a5-8f06-05d682c51638","Type":"ContainerStarted","Data":"516c4a93f2ae1061537b16603985d6a5d76ed1a50c109992f64ae46fe9cdb1d1"} Nov 29 02:55:17 crc kubenswrapper[4749]: W1129 02:55:17.728711 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aae6a67_cff1_4a25_b1b9_eecf82756432.slice/crio-8d201b13e7e71da549dfcecc4e3aa823dc82f10e18df1d681162d43f0ebbf347 WatchSource:0}: Error finding container 8d201b13e7e71da549dfcecc4e3aa823dc82f10e18df1d681162d43f0ebbf347: Status 404 returned error can't find the container with id 8d201b13e7e71da549dfcecc4e3aa823dc82f10e18df1d681162d43f0ebbf347 Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.751170 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f"] Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.864187 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-rqm5f"] Nov 29 02:55:17 crc kubenswrapper[4749]: W1129 02:55:17.871095 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc37afd3_a5b8_42ba_9749_dd7435506f70.slice/crio-7440ffb2c16dfb4bdc0a4d13a0467cee7972fba1a6df7e74f974a1d5928f982f WatchSource:0}: Error finding container 7440ffb2c16dfb4bdc0a4d13a0467cee7972fba1a6df7e74f974a1d5928f982f: Status 404 returned error can't find the container with id 7440ffb2c16dfb4bdc0a4d13a0467cee7972fba1a6df7e74f974a1d5928f982f Nov 29 02:55:17 crc kubenswrapper[4749]: I1129 02:55:17.993515 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-xchfm"] Nov 29 02:55:18 crc kubenswrapper[4749]: I1129 02:55:18.649463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" event={"ID":"9aae6a67-cff1-4a25-b1b9-eecf82756432","Type":"ContainerStarted","Data":"8d201b13e7e71da549dfcecc4e3aa823dc82f10e18df1d681162d43f0ebbf347"} Nov 29 02:55:18 crc kubenswrapper[4749]: I1129 02:55:18.651657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" event={"ID":"bc37afd3-a5b8-42ba-9749-dd7435506f70","Type":"ContainerStarted","Data":"7440ffb2c16dfb4bdc0a4d13a0467cee7972fba1a6df7e74f974a1d5928f982f"} Nov 29 02:55:18 crc kubenswrapper[4749]: I1129 02:55:18.652573 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" event={"ID":"9f2b11e6-22ff-4b26-a9d3-52241843dde7","Type":"ContainerStarted","Data":"fd3e35b7775b0227f55acef67b0c1311b5445001b26ac189360935fc33d30fb7"} Nov 29 02:55:20 crc kubenswrapper[4749]: I1129 02:55:20.040334 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v6nvh"] Nov 29 02:55:20 crc kubenswrapper[4749]: I1129 02:55:20.050530 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v6nvh"] Nov 29 02:55:20 crc kubenswrapper[4749]: I1129 02:55:20.360301 4749 scope.go:117] "RemoveContainer" containerID="3ce065b02ede3505b2dde58138e4310a87b945e2a9ffc2213fb48f573ed255bd" Nov 29 02:55:20 crc kubenswrapper[4749]: I1129 02:55:20.472403 4749 scope.go:117] "RemoveContainer" containerID="1604cc0ca142cede583092b141edd470bd06006fd9234bfcc160f956b6c1fbe4" Nov 29 02:55:21 crc kubenswrapper[4749]: I1129 02:55:21.129410 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d34f843-bfea-4a0c-ac09-3d6da8ac941d" path="/var/lib/kubelet/pods/7d34f843-bfea-4a0c-ac09-3d6da8ac941d/volumes" Nov 29 02:55:21 crc kubenswrapper[4749]: I1129 02:55:21.210350 4749 scope.go:117] "RemoveContainer" containerID="ec1124e6ca2676848d0bc99bf74118baa8c8c5569dd02d33d4de4b3b845de850" Nov 29 02:55:22 crc kubenswrapper[4749]: I1129 02:55:22.256830 4749 scope.go:117] "RemoveContainer" containerID="6f8b989eb08b24efbf5ffb9d3cd95cea8f648225560eb6c8b6bd8e9bec4f7c75" Nov 29 02:55:22 crc kubenswrapper[4749]: I1129 02:55:22.396789 4749 scope.go:117] "RemoveContainer" containerID="258fc7cccacaab4e83383ea19c59acaac2e0bdf117b2cb2c4b6c94702178cf19" Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.722760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" event={"ID":"3605df88-08f3-4d94-bed9-34339855602a","Type":"ContainerStarted","Data":"c815b2bb760a8c79b14d5946ee36befd1e44538b3bb58409f4f2c7b10f7957f6"} Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.734560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" event={"ID":"f261a062-257e-44a5-8f06-05d682c51638","Type":"ContainerStarted","Data":"70f0bbb36cbf333ef376bedd8de95f65896cc4042996fbd4b5459bfd4a18674a"} Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.736600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" event={"ID":"9aae6a67-cff1-4a25-b1b9-eecf82756432","Type":"ContainerStarted","Data":"4ede8bfe00f9a6fd03ba8ae614d0238040764f8d79b72fec3ebef6b413e5de5e"} Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.740772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" event={"ID":"bc37afd3-a5b8-42ba-9749-dd7435506f70","Type":"ContainerStarted","Data":"144fb3807081ad6f29067c9b0b2c93e774f45f3d4f07d01d7a8dd7ca268af374"} Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.741518 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.744743 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-2hprp" podStartSLOduration=2.602167508 
podStartE2EDuration="7.744723938s" podCreationTimestamp="2025-11-29 02:55:16 +0000 UTC" firstStartedPulling="2025-11-29 02:55:17.384301742 +0000 UTC m=+6260.556451599" lastFinishedPulling="2025-11-29 02:55:22.526858172 +0000 UTC m=+6265.699008029" observedRunningTime="2025-11-29 02:55:23.739774498 +0000 UTC m=+6266.911924365" watchObservedRunningTime="2025-11-29 02:55:23.744723938 +0000 UTC m=+6266.916873795" Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.812112 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh" podStartSLOduration=3.156280405 podStartE2EDuration="7.812093693s" podCreationTimestamp="2025-11-29 02:55:16 +0000 UTC" firstStartedPulling="2025-11-29 02:55:17.601527923 +0000 UTC m=+6260.773677780" lastFinishedPulling="2025-11-29 02:55:22.257341211 +0000 UTC m=+6265.429491068" observedRunningTime="2025-11-29 02:55:23.800473831 +0000 UTC m=+6266.972623698" watchObservedRunningTime="2025-11-29 02:55:23.812093693 +0000 UTC m=+6266.984243550" Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.882007 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f" podStartSLOduration=3.219927098 podStartE2EDuration="7.881991548s" podCreationTimestamp="2025-11-29 02:55:16 +0000 UTC" firstStartedPulling="2025-11-29 02:55:17.741334516 +0000 UTC m=+6260.913484373" lastFinishedPulling="2025-11-29 02:55:22.403398966 +0000 UTC m=+6265.575548823" observedRunningTime="2025-11-29 02:55:23.880587584 +0000 UTC m=+6267.052737441" watchObservedRunningTime="2025-11-29 02:55:23.881991548 +0000 UTC m=+6267.054141405" Nov 29 02:55:23 crc kubenswrapper[4749]: I1129 02:55:23.949848 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" podStartSLOduration=3.429937384 podStartE2EDuration="7.949826414s" podCreationTimestamp="2025-11-29 02:55:16 +0000 UTC" firstStartedPulling="2025-11-29 02:55:17.875947133 +0000 UTC m=+6261.048096990" lastFinishedPulling="2025-11-29 02:55:22.395836173 +0000 UTC m=+6265.567986020" observedRunningTime="2025-11-29 02:55:23.911578476 +0000 UTC m=+6267.083728333" watchObservedRunningTime="2025-11-29 02:55:23.949826414 +0000 UTC m=+6267.121976271" Nov 29 02:55:27 crc kubenswrapper[4749]: I1129 02:55:27.192662 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-rqm5f" Nov 29 02:55:27 crc kubenswrapper[4749]: I1129 02:55:27.778847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" event={"ID":"9f2b11e6-22ff-4b26-a9d3-52241843dde7","Type":"ContainerStarted","Data":"8f847257e46a56f5c24cc418a2121a6524c38d2aab5152bc712bdd85fe3c0eba"} Nov 29 02:55:27 crc kubenswrapper[4749]: I1129 02:55:27.780560 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:27 crc kubenswrapper[4749]: I1129 02:55:27.807120 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" podStartSLOduration=2.704088691 podStartE2EDuration="11.807103754s" podCreationTimestamp="2025-11-29 02:55:16 +0000 UTC" firstStartedPulling="2025-11-29 02:55:17.989085749 +0000 UTC m=+6261.161235606" lastFinishedPulling="2025-11-29 
02:55:27.092100812 +0000 UTC m=+6270.264250669" observedRunningTime="2025-11-29 02:55:27.798481724 +0000 UTC m=+6270.970631581" watchObservedRunningTime="2025-11-29 02:55:27.807103754 +0000 UTC m=+6270.979253611" Nov 29 02:55:27 crc kubenswrapper[4749]: I1129 02:55:27.819798 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-xchfm" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.374031 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.374669 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="6ebecb6a-b8de-43fa-b228-7008102bd827" containerName="openstackclient" containerID="cri-o://85fa615830a4186f7b1e92991aba88cdc6a2c2bced5e044e32b8eb9fb8d184dc" gracePeriod=2 Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.390302 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.442429 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 29 02:55:30 crc kubenswrapper[4749]: E1129 02:55:30.442835 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebecb6a-b8de-43fa-b228-7008102bd827" containerName="openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.442852 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebecb6a-b8de-43fa-b228-7008102bd827" containerName="openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.443020 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebecb6a-b8de-43fa-b228-7008102bd827" containerName="openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.443712 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.447939 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6ebecb6a-b8de-43fa-b228-7008102bd827" podUID="a76d2e70-2758-463e-b25f-6cd80067450a" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.457747 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmmt\" (UniqueName: \"kubernetes.io/projected/a76d2e70-2758-463e-b25f-6cd80067450a-kube-api-access-hsmmt\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.457788 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.457881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.531658 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.561376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmmt\" (UniqueName: \"kubernetes.io/projected/a76d2e70-2758-463e-b25f-6cd80067450a-kube-api-access-hsmmt\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.561418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.561505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.565175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.566803 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a76d2e70-2758-463e-b25f-6cd80067450a-openstack-config-secret\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc 
kubenswrapper[4749]: I1129 02:55:30.626918 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmmt\" (UniqueName: \"kubernetes.io/projected/a76d2e70-2758-463e-b25f-6cd80067450a-kube-api-access-hsmmt\") pod \"openstackclient\" (UID: \"a76d2e70-2758-463e-b25f-6cd80067450a\") " pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.681871 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.694924 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.699269 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jq7c9" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.724781 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.772798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.870950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nc9b\" (UniqueName: \"kubernetes.io/projected/55eaf704-2521-4487-a458-f38da05c48fc-kube-api-access-7nc9b\") pod \"kube-state-metrics-0\" (UID: \"55eaf704-2521-4487-a458-f38da05c48fc\") " pod="openstack/kube-state-metrics-0" Nov 29 02:55:30 crc kubenswrapper[4749]: I1129 02:55:30.972369 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nc9b\" (UniqueName: \"kubernetes.io/projected/55eaf704-2521-4487-a458-f38da05c48fc-kube-api-access-7nc9b\") pod \"kube-state-metrics-0\" (UID: \"55eaf704-2521-4487-a458-f38da05c48fc\") " pod="openstack/kube-state-metrics-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.022754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nc9b\" (UniqueName: \"kubernetes.io/projected/55eaf704-2521-4487-a458-f38da05c48fc-kube-api-access-7nc9b\") pod \"kube-state-metrics-0\" (UID: \"55eaf704-2521-4487-a458-f38da05c48fc\") " pod="openstack/kube-state-metrics-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.023442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.433475 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.464273 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.464414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.474235 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.474413 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.474535 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.474713 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-4qf8b" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.474817 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496453 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496559 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cdw\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-kube-api-access-56cdw\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.496732 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.504629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603316 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cdw\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-kube-api-access-56cdw\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.603581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.604241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-alertmanager-metric-storage-db\") pod 
\"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.630796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.630897 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.631333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.631539 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/44380984-9564-4578-8a96-024ee3db589a-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.633798 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/44380984-9564-4578-8a96-024ee3db589a-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.683932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cdw\" (UniqueName: \"kubernetes.io/projected/44380984-9564-4578-8a96-024ee3db589a-kube-api-access-56cdw\") pod \"alertmanager-metric-storage-0\" (UID: \"44380984-9564-4578-8a96-024ee3db589a\") " pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.809669 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:31 crc kubenswrapper[4749]: I1129 02:55:31.840370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a76d2e70-2758-463e-b25f-6cd80067450a","Type":"ContainerStarted","Data":"9a867d582c00a9ecb83b15dfd0c88aa4a3b400cb4c4e8ce874cd3c6c2eb741de"} Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.337359 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.339974 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.345566 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.346328 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.346431 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.346560 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.346707 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-r46q5" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.355214 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.590608 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-482e736b-dab7-4d32-8117-1a410385b56c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-482e736b-dab7-4d32-8117-1a410385b56c\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.590906 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.590954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.591022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.591139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.603276 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.606921 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bc6897f-040b-48d5-ad08-eec6d6f8f671-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.606997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.607092 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwnxh\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-kube-api-access-xwnxh\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-482e736b-dab7-4d32-8117-1a410385b56c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-482e736b-dab7-4d32-8117-1a410385b56c\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bc6897f-040b-48d5-ad08-eec6d6f8f671-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " 
pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.709641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwnxh\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-kube-api-access-xwnxh\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.715486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4bc6897f-040b-48d5-ad08-eec6d6f8f671-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.720903 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.723397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.725578 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.731801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.731873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4bc6897f-040b-48d5-ad08-eec6d6f8f671-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.733345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.759886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwnxh\" (UniqueName: \"kubernetes.io/projected/4bc6897f-040b-48d5-ad08-eec6d6f8f671-kube-api-access-xwnxh\") 
pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.781797 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.781839 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-482e736b-dab7-4d32-8117-1a410385b56c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-482e736b-dab7-4d32-8117-1a410385b56c\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d42d415acc2634dc047ee99be3ca2f8fd7b0b765e013bc11fb1b638f1877170/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.863497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"55eaf704-2521-4487-a458-f38da05c48fc","Type":"ContainerStarted","Data":"f52f03c46d186fbd80659b8531dcd5be336f7d994456ce7ecea93b7ffa424bca"} Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.865938 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a76d2e70-2758-463e-b25f-6cd80067450a","Type":"ContainerStarted","Data":"d284829e98bdf956313fc9a1a6e7a27777552dcf2510c403c3e8bdc81ddec6f7"} Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.895459 4749 generic.go:334] "Generic (PLEG): container finished" podID="6ebecb6a-b8de-43fa-b228-7008102bd827" containerID="85fa615830a4186f7b1e92991aba88cdc6a2c2bced5e044e32b8eb9fb8d184dc" exitCode=137 Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.919373 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 29 02:55:32 crc kubenswrapper[4749]: I1129 02:55:32.922703 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.922666689 podStartE2EDuration="2.922666689s" podCreationTimestamp="2025-11-29 02:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:55:32.905815 +0000 UTC m=+6276.077964857" watchObservedRunningTime="2025-11-29 02:55:32.922666689 +0000 UTC m=+6276.094816566" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.001688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-482e736b-dab7-4d32-8117-1a410385b56c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-482e736b-dab7-4d32-8117-1a410385b56c\") pod \"prometheus-metric-storage-0\" (UID: \"4bc6897f-040b-48d5-ad08-eec6d6f8f671\") " pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.016519 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.438795 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.533272 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret\") pod \"6ebecb6a-b8de-43fa-b228-7008102bd827\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.533408 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config\") pod \"6ebecb6a-b8de-43fa-b228-7008102bd827\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.533551 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlnm8\" (UniqueName: \"kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8\") pod \"6ebecb6a-b8de-43fa-b228-7008102bd827\" (UID: \"6ebecb6a-b8de-43fa-b228-7008102bd827\") " Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.542872 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8" (OuterVolumeSpecName: "kube-api-access-hlnm8") pod "6ebecb6a-b8de-43fa-b228-7008102bd827" (UID: "6ebecb6a-b8de-43fa-b228-7008102bd827"). InnerVolumeSpecName "kube-api-access-hlnm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.565869 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.580050 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6ebecb6a-b8de-43fa-b228-7008102bd827" (UID: "6ebecb6a-b8de-43fa-b228-7008102bd827"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.599990 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6ebecb6a-b8de-43fa-b228-7008102bd827" (UID: "6ebecb6a-b8de-43fa-b228-7008102bd827"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.635879 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.636180 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlnm8\" (UniqueName: \"kubernetes.io/projected/6ebecb6a-b8de-43fa-b228-7008102bd827-kube-api-access-hlnm8\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.636198 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ebecb6a-b8de-43fa-b228-7008102bd827-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.912342 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerStarted","Data":"1dfbbb576ecb490873e2ce2d5e9d46f3bbe79f81cfeda3f4e20bab09b9ce9229"} Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.913366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"44380984-9564-4578-8a96-024ee3db589a","Type":"ContainerStarted","Data":"8c87fd94eea6e063b2f2162b846b9c0a0b5dfbf357c13731a53dc0180acd1ba9"} Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.914608 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.915241 4749 scope.go:117] "RemoveContainer" containerID="85fa615830a4186f7b1e92991aba88cdc6a2c2bced5e044e32b8eb9fb8d184dc" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.928276 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"55eaf704-2521-4487-a458-f38da05c48fc","Type":"ContainerStarted","Data":"5ee1cc5b0b8ef62a9feea3756cbfe1d2c8bfa2227f32405abe5f5625566ee175"} Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.928453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.951705 4749 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6ebecb6a-b8de-43fa-b228-7008102bd827" podUID="a76d2e70-2758-463e-b25f-6cd80067450a" Nov 29 02:55:33 crc kubenswrapper[4749]: I1129 02:55:33.959111 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.306840983 podStartE2EDuration="3.959093782s" podCreationTimestamp="2025-11-29 02:55:30 +0000 UTC" firstStartedPulling="2025-11-29 02:55:32.634838414 +0000 UTC m=+6275.806988271" lastFinishedPulling="2025-11-29 02:55:33.287091213 +0000 UTC m=+6276.459241070" observedRunningTime="2025-11-29 02:55:33.945518362 +0000 UTC m=+6277.117668219" watchObservedRunningTime="2025-11-29 02:55:33.959093782 +0000 UTC m=+6277.131243639" Nov 29 02:55:35 crc kubenswrapper[4749]: I1129 02:55:35.094003 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebecb6a-b8de-43fa-b228-7008102bd827" path="/var/lib/kubelet/pods/6ebecb6a-b8de-43fa-b228-7008102bd827/volumes" Nov 29 02:55:39 crc kubenswrapper[4749]: I1129 
02:55:39.996909 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"44380984-9564-4578-8a96-024ee3db589a","Type":"ContainerStarted","Data":"68fa8f2c9fd94b0d5c9502a547ce7fd38417bb1b930ef69cb31cef97672a4535"} Nov 29 02:55:39 crc kubenswrapper[4749]: I1129 02:55:39.999048 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerStarted","Data":"7bb39d8e848689bafbb0dd18ebd643746afed6ea76a184dc514ae761a3a407c5"} Nov 29 02:55:41 crc kubenswrapper[4749]: I1129 02:55:41.027938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 29 02:55:46 crc kubenswrapper[4749]: I1129 02:55:46.063682 4749 generic.go:334] "Generic (PLEG): container finished" podID="4bc6897f-040b-48d5-ad08-eec6d6f8f671" containerID="7bb39d8e848689bafbb0dd18ebd643746afed6ea76a184dc514ae761a3a407c5" exitCode=0 Nov 29 02:55:46 crc kubenswrapper[4749]: I1129 02:55:46.063846 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerDied","Data":"7bb39d8e848689bafbb0dd18ebd643746afed6ea76a184dc514ae761a3a407c5"} Nov 29 02:55:46 crc kubenswrapper[4749]: I1129 02:55:46.069004 4749 generic.go:334] "Generic (PLEG): container finished" podID="44380984-9564-4578-8a96-024ee3db589a" containerID="68fa8f2c9fd94b0d5c9502a547ce7fd38417bb1b930ef69cb31cef97672a4535" exitCode=0 Nov 29 02:55:46 crc kubenswrapper[4749]: I1129 02:55:46.069071 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"44380984-9564-4578-8a96-024ee3db589a","Type":"ContainerDied","Data":"68fa8f2c9fd94b0d5c9502a547ce7fd38417bb1b930ef69cb31cef97672a4535"} Nov 29 02:55:49 crc kubenswrapper[4749]: I1129 02:55:49.117375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"44380984-9564-4578-8a96-024ee3db589a","Type":"ContainerStarted","Data":"6e753af3aea7f3143746998f0f74b5a83fea1c24401021e44e89f948c035e62a"} Nov 29 02:55:53 crc kubenswrapper[4749]: I1129 02:55:53.166219 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerStarted","Data":"8e1341389142cfe069c09f75fec55029bd086731d6844766c9d14bd08bccac86"} Nov 29 02:55:53 crc kubenswrapper[4749]: I1129 02:55:53.170950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"44380984-9564-4578-8a96-024ee3db589a","Type":"ContainerStarted","Data":"0394b008bce84716807b305f2a4ef1c902d22d2f1fea791662277cab4e2190e3"} Nov 29 02:55:53 crc kubenswrapper[4749]: I1129 02:55:53.171435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:53 crc kubenswrapper[4749]: I1129 02:55:53.177666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Nov 29 02:55:53 crc kubenswrapper[4749]: I1129 02:55:53.207513 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.327596667 podStartE2EDuration="22.207498034s" podCreationTimestamp="2025-11-29 02:55:31 +0000 UTC" firstStartedPulling="2025-11-29 02:55:32.930657553 
+0000 UTC m=+6276.102807410" lastFinishedPulling="2025-11-29 02:55:48.81055892 +0000 UTC m=+6291.982708777" observedRunningTime="2025-11-29 02:55:53.193832083 +0000 UTC m=+6296.365981950" watchObservedRunningTime="2025-11-29 02:55:53.207498034 +0000 UTC m=+6296.379647891" Nov 29 02:55:55 crc kubenswrapper[4749]: I1129 02:55:55.854296 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 02:55:56 crc kubenswrapper[4749]: I1129 02:55:56.218731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerStarted","Data":"a81375fc9dfe6a4ec71fac7de9dfde4ab00fff5b7a8c8fe9248b34dd14eec792"} Nov 29 02:56:02 crc kubenswrapper[4749]: I1129 02:56:02.296268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"4bc6897f-040b-48d5-ad08-eec6d6f8f671","Type":"ContainerStarted","Data":"8d81a6f4986fbfef967a1475473f4abd378fc9e56ea78430e4564e4ada730566"} Nov 29 02:56:02 crc kubenswrapper[4749]: I1129 02:56:02.366061 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.276706259 podStartE2EDuration="31.366029615s" podCreationTimestamp="2025-11-29 02:55:31 +0000 UTC" firstStartedPulling="2025-11-29 02:55:33.580643737 +0000 UTC m=+6276.752793594" lastFinishedPulling="2025-11-29 02:56:01.669967093 +0000 UTC m=+6304.842116950" observedRunningTime="2025-11-29 02:56:02.350901848 +0000 UTC m=+6305.523051745" watchObservedRunningTime="2025-11-29 02:56:02.366029615 +0000 UTC m=+6305.538179552" Nov 29 02:56:03 crc kubenswrapper[4749]: I1129 02:56:03.017810 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 29 02:56:03 crc kubenswrapper[4749]: I1129 02:56:03.018191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 29 02:56:03 crc kubenswrapper[4749]: I1129 02:56:03.021720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 29 02:56:03 crc kubenswrapper[4749]: I1129 02:56:03.309660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 29 02:56:05 crc kubenswrapper[4749]: I1129 02:56:05.934491 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:05 crc kubenswrapper[4749]: I1129 02:56:05.944210 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:05 crc kubenswrapper[4749]: I1129 02:56:05.950109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 02:56:05 crc kubenswrapper[4749]: I1129 02:56:05.955893 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 02:56:05 crc kubenswrapper[4749]: I1129 02:56:05.966028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.024001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.024108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.024219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.025338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.025533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkxq6\" (UniqueName: \"kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.025582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.025737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 
02:56:06.127179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127352 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkxq6\" (UniqueName: \"kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.127455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.128617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.128723 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.134928 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.138567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.138772 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.147939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.148697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkxq6\" (UniqueName: \"kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6\") pod \"ceilometer-0\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.274452 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:06 crc kubenswrapper[4749]: I1129 02:56:06.750027 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:07 crc kubenswrapper[4749]: I1129 02:56:07.357574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerStarted","Data":"b9d5e62ef3a0ec7a2228ac4b6ff7c4a08189fe2a22627309d3d8d858bf96db08"} Nov 29 02:56:08 crc kubenswrapper[4749]: I1129 02:56:08.379957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerStarted","Data":"19f1bb570219e384c0b086ef6f7d9eb806a02574c2ccc627e647b48a6a809ad9"} Nov 29 02:56:09 crc kubenswrapper[4749]: I1129 02:56:09.390440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerStarted","Data":"f3f8f05a44984ecd1d427d13ca004f02a81b0ad058a17e4468ecba4ac8d429bf"} Nov 29 02:56:10 crc kubenswrapper[4749]: I1129 02:56:10.401796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerStarted","Data":"ba99f5ee9386ea3c4abb3a01a964b876a74380b9523ffa0286363aa8e094612a"} Nov 29 02:56:11 crc kubenswrapper[4749]: I1129 02:56:11.430385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerStarted","Data":"fbc2e75d6ab9a5f7f585c4986ee37b0078c1050d45562a709893ad49ddf577a0"} Nov 29 02:56:11 crc kubenswrapper[4749]: I1129 02:56:11.430824 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 02:56:11 crc kubenswrapper[4749]: I1129 02:56:11.470745 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.105934676 podStartE2EDuration="6.47072421s" podCreationTimestamp="2025-11-29 02:56:05 +0000 UTC" firstStartedPulling="2025-11-29 02:56:06.757822067 +0000 UTC m=+6309.929971914" lastFinishedPulling="2025-11-29 02:56:11.122611581 +0000 UTC m=+6314.294761448" observedRunningTime="2025-11-29 02:56:11.463451783 +0000 UTC m=+6314.635601660" watchObservedRunningTime="2025-11-29 02:56:11.47072421 +0000 UTC m=+6314.642874077" Nov 29 02:56:17 crc kubenswrapper[4749]: 
I1129 02:56:17.018226 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-nkzq6"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.019835 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.068503 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc7wz\" (UniqueName: \"kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.068570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.088649 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nkzq6"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.114829 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-8twmk"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.122026 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-8twmk"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.137635 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-32db-account-create-update-k6bvn"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.139278 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.149591 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.164421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-32db-account-create-update-k6bvn"] Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.171158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc7wz\" (UniqueName: \"kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.171394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.173222 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.214541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc7wz\" (UniqueName: \"kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz\") pod \"aodh-db-create-nkzq6\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.274089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.274553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsx2h\" (UniqueName: \"kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.371715 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.376971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.377218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsx2h\" (UniqueName: \"kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.377718 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.398018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsx2h\" (UniqueName: \"kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h\") pod \"aodh-32db-account-create-update-k6bvn\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.466977 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:17 crc kubenswrapper[4749]: I1129 02:56:17.878104 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-nkzq6"] Nov 29 02:56:17 crc kubenswrapper[4749]: W1129 02:56:17.880902 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2831df11_ac96_49f4_9669_9c549e61a190.slice/crio-8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a WatchSource:0}: Error finding container 8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a: Status 404 returned error can't find the container with id 8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a Nov 29 02:56:18 crc kubenswrapper[4749]: W1129 02:56:18.020738 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e1e0df9_b788_4227_a867_61420e8494f5.slice/crio-dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c WatchSource:0}: Error finding container dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c: Status 404 returned error can't find the container with id dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.023899 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-32db-account-create-update-k6bvn"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.048619 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hgh7f"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.061731 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5140-account-create-update-xjmf2"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.077906 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5140-account-create-update-xjmf2"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.092839 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hgh7f"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.104042 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-117b-account-create-update-tmlsv"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.116280 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rs5r5"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.135361 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-117b-account-create-update-tmlsv"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.145961 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rs5r5"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.157376 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fab4-account-create-update-vn8tp"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.167317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fab4-account-create-update-vn8tp"] Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.538404 4749 generic.go:334] "Generic (PLEG): container finished" podID="2831df11-ac96-49f4-9669-9c549e61a190" containerID="a1fa6836123b024905c592bf3fe11a5ca39ccbbb57fff5711a6c40c2e802b1f4" exitCode=0 Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.538487 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nkzq6" event={"ID":"2831df11-ac96-49f4-9669-9c549e61a190","Type":"ContainerDied","Data":"a1fa6836123b024905c592bf3fe11a5ca39ccbbb57fff5711a6c40c2e802b1f4"} Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.538563 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nkzq6" event={"ID":"2831df11-ac96-49f4-9669-9c549e61a190","Type":"ContainerStarted","Data":"8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a"} Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.540295 4749 generic.go:334] "Generic (PLEG): container finished" podID="7e1e0df9-b788-4227-a867-61420e8494f5" containerID="77d01ee553bfe049ee0f3c19ea92b4455d0c514e96980994730f84588ede205d" exitCode=0 Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.540336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-32db-account-create-update-k6bvn" event={"ID":"7e1e0df9-b788-4227-a867-61420e8494f5","Type":"ContainerDied","Data":"77d01ee553bfe049ee0f3c19ea92b4455d0c514e96980994730f84588ede205d"} Nov 29 02:56:18 crc kubenswrapper[4749]: I1129 02:56:18.540368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-32db-account-create-update-k6bvn" event={"ID":"7e1e0df9-b788-4227-a867-61420e8494f5","Type":"ContainerStarted","Data":"dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c"} Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.089843 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336fb058-ef0f-445c-b051-d08c95431b6b" path="/var/lib/kubelet/pods/336fb058-ef0f-445c-b051-d08c95431b6b/volumes" Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.092307 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5915b3e3-1865-484d-88aa-54633385cf59" path="/var/lib/kubelet/pods/5915b3e3-1865-484d-88aa-54633385cf59/volumes" Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.094418 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8efc8ea-5df9-437b-a0cf-c65eb1e6b454" path="/var/lib/kubelet/pods/a8efc8ea-5df9-437b-a0cf-c65eb1e6b454/volumes" Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.095965 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eeacbb-6c08-475f-812d-bc6c94c22fe6" path="/var/lib/kubelet/pods/b9eeacbb-6c08-475f-812d-bc6c94c22fe6/volumes" Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.101548 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90f876a-7350-43b0-ad17-c0579cbf013d" path="/var/lib/kubelet/pods/c90f876a-7350-43b0-ad17-c0579cbf013d/volumes" Nov 29 02:56:19 crc kubenswrapper[4749]: I1129 02:56:19.103983 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a66458-2d12-4987-b43f-735fe5fb5528" path="/var/lib/kubelet/pods/f0a66458-2d12-4987-b43f-735fe5fb5528/volumes" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.053555 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.059077 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.136801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts\") pod \"2831df11-ac96-49f4-9669-9c549e61a190\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.137846 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2831df11-ac96-49f4-9669-9c549e61a190" (UID: "2831df11-ac96-49f4-9669-9c549e61a190"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.138069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsx2h\" (UniqueName: \"kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h\") pod \"7e1e0df9-b788-4227-a867-61420e8494f5\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.138907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts\") pod \"7e1e0df9-b788-4227-a867-61420e8494f5\" (UID: \"7e1e0df9-b788-4227-a867-61420e8494f5\") " Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.139065 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc7wz\" (UniqueName: \"kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz\") pod \"2831df11-ac96-49f4-9669-9c549e61a190\" (UID: \"2831df11-ac96-49f4-9669-9c549e61a190\") " Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.139506 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e1e0df9-b788-4227-a867-61420e8494f5" (UID: "7e1e0df9-b788-4227-a867-61420e8494f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.140121 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2831df11-ac96-49f4-9669-9c549e61a190-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.140286 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e1e0df9-b788-4227-a867-61420e8494f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.143403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h" (OuterVolumeSpecName: "kube-api-access-fsx2h") pod "7e1e0df9-b788-4227-a867-61420e8494f5" (UID: "7e1e0df9-b788-4227-a867-61420e8494f5"). InnerVolumeSpecName "kube-api-access-fsx2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.144244 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz" (OuterVolumeSpecName: "kube-api-access-jc7wz") pod "2831df11-ac96-49f4-9669-9c549e61a190" (UID: "2831df11-ac96-49f4-9669-9c549e61a190"). InnerVolumeSpecName "kube-api-access-jc7wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.242164 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc7wz\" (UniqueName: \"kubernetes.io/projected/2831df11-ac96-49f4-9669-9c549e61a190-kube-api-access-jc7wz\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.242439 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsx2h\" (UniqueName: \"kubernetes.io/projected/7e1e0df9-b788-4227-a867-61420e8494f5-kube-api-access-fsx2h\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.569070 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-32db-account-create-update-k6bvn" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.569057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-32db-account-create-update-k6bvn" event={"ID":"7e1e0df9-b788-4227-a867-61420e8494f5","Type":"ContainerDied","Data":"dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c"} Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.569287 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd5ea5b77e3e0738743f0dd7dd7c6a6af8fd195c8ad9172862939f536bc4f2c" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.572329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-nkzq6" event={"ID":"2831df11-ac96-49f4-9669-9c549e61a190","Type":"ContainerDied","Data":"8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a"} Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.572376 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-nkzq6" Nov 29 02:56:20 crc kubenswrapper[4749]: I1129 02:56:20.572402 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dac16b66bc8890d1c07b703699f8437537ef71998d22306fd4fac2fa7b1013a" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.517538 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8ccv8"] Nov 29 02:56:22 crc kubenswrapper[4749]: E1129 02:56:22.519019 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1e0df9-b788-4227-a867-61420e8494f5" containerName="mariadb-account-create-update" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.519048 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1e0df9-b788-4227-a867-61420e8494f5" containerName="mariadb-account-create-update" Nov 29 02:56:22 crc kubenswrapper[4749]: E1129 02:56:22.519112 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2831df11-ac96-49f4-9669-9c549e61a190" containerName="mariadb-database-create" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.519125 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2831df11-ac96-49f4-9669-9c549e61a190" containerName="mariadb-database-create" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.519540 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2831df11-ac96-49f4-9669-9c549e61a190" containerName="mariadb-database-create" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.519581 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1e0df9-b788-4227-a867-61420e8494f5" containerName="mariadb-account-create-update" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.520925 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.524606 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.524838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.526374 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.529143 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8ccv8"] Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.530582 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-4lt9x" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.597607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vssph\" (UniqueName: \"kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.597656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.597816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.597881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.701052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vssph\" (UniqueName: \"kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.701424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.701611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: 
I1129 02:56:22.701738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.708141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.708609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.709167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.714538 4749 scope.go:117] "RemoveContainer" containerID="14b4d2fe98f5cef9a7413baeb1ee82059abf4722439711d9d1988bd31c44fe1e" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.734818 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vssph\" (UniqueName: \"kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph\") pod \"aodh-db-sync-8ccv8\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.828156 4749 scope.go:117] "RemoveContainer" containerID="71e5a30db6c4b1de3d06b6020bfe9670a9c0edd6f88dc7fdd63b171397efce0e" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.843754 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:22 crc kubenswrapper[4749]: I1129 02:56:22.930522 4749 scope.go:117] "RemoveContainer" containerID="e15febb8618783e33f4b1b8e1b15ecf071c9e7f8d01ae59128af43c4966a1e7b" Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.008425 4749 scope.go:117] "RemoveContainer" containerID="67af94c1d6da024cebfe8f79403c4f2253203067dfb763245b8e322eda8571d1" Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.046750 4749 scope.go:117] "RemoveContainer" containerID="30c9e3e80aca8a2d9aab37673fe910cc0a3959848920f32cc9409010ffda0b62" Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.097734 4749 scope.go:117] "RemoveContainer" containerID="5018f581bbf19aeca63686918b4973bfb894222ff0095532434bb8e4451105d9" Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.147138 4749 scope.go:117] "RemoveContainer" containerID="0081f70d8987acdf116f6c43199fdcdeafe856189fd8154e95fe7e4c85f2f0a8" Nov 29 02:56:23 crc kubenswrapper[4749]: W1129 02:56:23.514195 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod881ef581_f476_4e44_b01f_797f4fa23d1f.slice/crio-4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258 WatchSource:0}: Error finding container 4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258: Status 404 returned error can't find the container with id 4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258 Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.514672 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8ccv8"] Nov 29 02:56:23 crc kubenswrapper[4749]: I1129 02:56:23.604626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8ccv8" event={"ID":"881ef581-f476-4e44-b01f-797f4fa23d1f","Type":"ContainerStarted","Data":"4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258"} Nov 29 02:56:25 crc kubenswrapper[4749]: I1129 02:56:25.374364 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:56:25 crc kubenswrapper[4749]: I1129 02:56:25.374890 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:56:27 crc kubenswrapper[4749]: I1129 02:56:27.061825 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n2nl8"] Nov 29 02:56:27 crc kubenswrapper[4749]: I1129 02:56:27.098356 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n2nl8"] Nov 29 02:56:28 crc kubenswrapper[4749]: I1129 02:56:28.658277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8ccv8" event={"ID":"881ef581-f476-4e44-b01f-797f4fa23d1f","Type":"ContainerStarted","Data":"8df11921057fe81797b26a62a65f836e299eaf6d2bf6468123ef8d1956082de2"} Nov 29 02:56:28 crc kubenswrapper[4749]: I1129 02:56:28.689755 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8ccv8" 
podStartSLOduration=2.5970201299999998 podStartE2EDuration="6.689735323s" podCreationTimestamp="2025-11-29 02:56:22 +0000 UTC" firstStartedPulling="2025-11-29 02:56:23.517012321 +0000 UTC m=+6326.689162168" lastFinishedPulling="2025-11-29 02:56:27.609727494 +0000 UTC m=+6330.781877361" observedRunningTime="2025-11-29 02:56:28.688186605 +0000 UTC m=+6331.860336462" watchObservedRunningTime="2025-11-29 02:56:28.689735323 +0000 UTC m=+6331.861885190" Nov 29 02:56:29 crc kubenswrapper[4749]: I1129 02:56:29.090303 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbec9973-7a01-4dcd-af32-be7f72b9d461" path="/var/lib/kubelet/pods/fbec9973-7a01-4dcd-af32-be7f72b9d461/volumes" Nov 29 02:56:30 crc kubenswrapper[4749]: I1129 02:56:30.685990 4749 generic.go:334] "Generic (PLEG): container finished" podID="881ef581-f476-4e44-b01f-797f4fa23d1f" containerID="8df11921057fe81797b26a62a65f836e299eaf6d2bf6468123ef8d1956082de2" exitCode=0 Nov 29 02:56:30 crc kubenswrapper[4749]: I1129 02:56:30.686073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8ccv8" event={"ID":"881ef581-f476-4e44-b01f-797f4fa23d1f","Type":"ContainerDied","Data":"8df11921057fe81797b26a62a65f836e299eaf6d2bf6468123ef8d1956082de2"} Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.197962 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.257160 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle\") pod \"881ef581-f476-4e44-b01f-797f4fa23d1f\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.257292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vssph\" (UniqueName: \"kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph\") pod \"881ef581-f476-4e44-b01f-797f4fa23d1f\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.257419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data\") pod \"881ef581-f476-4e44-b01f-797f4fa23d1f\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.257482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts\") pod \"881ef581-f476-4e44-b01f-797f4fa23d1f\" (UID: \"881ef581-f476-4e44-b01f-797f4fa23d1f\") " Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.266465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph" (OuterVolumeSpecName: "kube-api-access-vssph") pod "881ef581-f476-4e44-b01f-797f4fa23d1f" (UID: "881ef581-f476-4e44-b01f-797f4fa23d1f"). InnerVolumeSpecName "kube-api-access-vssph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.268482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts" (OuterVolumeSpecName: "scripts") pod "881ef581-f476-4e44-b01f-797f4fa23d1f" (UID: "881ef581-f476-4e44-b01f-797f4fa23d1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.289837 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "881ef581-f476-4e44-b01f-797f4fa23d1f" (UID: "881ef581-f476-4e44-b01f-797f4fa23d1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.296225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data" (OuterVolumeSpecName: "config-data") pod "881ef581-f476-4e44-b01f-797f4fa23d1f" (UID: "881ef581-f476-4e44-b01f-797f4fa23d1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.360415 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vssph\" (UniqueName: \"kubernetes.io/projected/881ef581-f476-4e44-b01f-797f4fa23d1f-kube-api-access-vssph\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.360461 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.360477 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.360488 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/881ef581-f476-4e44-b01f-797f4fa23d1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.715702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8ccv8" event={"ID":"881ef581-f476-4e44-b01f-797f4fa23d1f","Type":"ContainerDied","Data":"4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258"} Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.715758 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aa768c56b933601e8d10c54a2bc47cd604aa364c332d188d78f38e3785c8258" Nov 29 02:56:32 crc kubenswrapper[4749]: I1129 02:56:32.715815 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8ccv8" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.286297 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.518779 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:36 crc kubenswrapper[4749]: E1129 02:56:36.519521 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881ef581-f476-4e44-b01f-797f4fa23d1f" containerName="aodh-db-sync" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.519553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="881ef581-f476-4e44-b01f-797f4fa23d1f" containerName="aodh-db-sync" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.519998 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="881ef581-f476-4e44-b01f-797f4fa23d1f" containerName="aodh-db-sync" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.523053 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.532783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.665932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npswl\" (UniqueName: \"kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.666398 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.666538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.768896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.769006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.769218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npswl\" 
(UniqueName: \"kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.769574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.769663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.804116 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npswl\" (UniqueName: \"kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl\") pod \"certified-operators-984d6\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:36 crc kubenswrapper[4749]: I1129 02:56:36.850396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.113135 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.116363 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.116460 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.121740 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.122022 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.122157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-4lt9x" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.178900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwlj\" (UniqueName: \"kubernetes.io/projected/c4f049eb-f374-4140-b694-2af94e54001e-kube-api-access-xvwlj\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.179180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-config-data\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.179373 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-scripts\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.179504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.281498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwlj\" (UniqueName: \"kubernetes.io/projected/c4f049eb-f374-4140-b694-2af94e54001e-kube-api-access-xvwlj\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.281578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-config-data\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.281638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-scripts\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.281681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.290026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-scripts\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.290795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.296377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f049eb-f374-4140-b694-2af94e54001e-config-data\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.305820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwlj\" (UniqueName: \"kubernetes.io/projected/c4f049eb-f374-4140-b694-2af94e54001e-kube-api-access-xvwlj\") pod \"aodh-0\" (UID: \"c4f049eb-f374-4140-b694-2af94e54001e\") " pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.402756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.440562 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.807344 4749 generic.go:334] "Generic (PLEG): container finished" podID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerID="88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5" exitCode=0 Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.807441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerDied","Data":"88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5"} Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.807937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerStarted","Data":"ae9d190e58bbcfd66e134b93237f50b07012f844fcc84aeb5a3f84f2ec8adac7"} Nov 29 02:56:37 crc kubenswrapper[4749]: I1129 02:56:37.888700 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 29 02:56:37 crc kubenswrapper[4749]: W1129 02:56:37.888887 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f049eb_f374_4140_b694_2af94e54001e.slice/crio-a559cb45a866823d21e50eb79d0c3a6c62b0864cfd0967cc3c9794ca0d6a1591 WatchSource:0}: Error finding container a559cb45a866823d21e50eb79d0c3a6c62b0864cfd0967cc3c9794ca0d6a1591: Status 404 returned error can't find the container with id a559cb45a866823d21e50eb79d0c3a6c62b0864cfd0967cc3c9794ca0d6a1591 Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.732789 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.733516 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-central-agent" 
containerID="cri-o://19f1bb570219e384c0b086ef6f7d9eb806a02574c2ccc627e647b48a6a809ad9" gracePeriod=30 Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.733797 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="proxy-httpd" containerID="cri-o://fbc2e75d6ab9a5f7f585c4986ee37b0078c1050d45562a709893ad49ddf577a0" gracePeriod=30 Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.733917 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="sg-core" containerID="cri-o://ba99f5ee9386ea3c4abb3a01a964b876a74380b9523ffa0286363aa8e094612a" gracePeriod=30 Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.733944 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-notification-agent" containerID="cri-o://f3f8f05a44984ecd1d427d13ca004f02a81b0ad058a17e4468ecba4ac8d429bf" gracePeriod=30 Nov 29 02:56:38 crc kubenswrapper[4749]: I1129 02:56:38.819688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4f049eb-f374-4140-b694-2af94e54001e","Type":"ContainerStarted","Data":"a559cb45a866823d21e50eb79d0c3a6c62b0864cfd0967cc3c9794ca0d6a1591"} Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844458 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerID="fbc2e75d6ab9a5f7f585c4986ee37b0078c1050d45562a709893ad49ddf577a0" exitCode=0 Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844792 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerID="ba99f5ee9386ea3c4abb3a01a964b876a74380b9523ffa0286363aa8e094612a" exitCode=2 Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844800 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerID="19f1bb570219e384c0b086ef6f7d9eb806a02574c2ccc627e647b48a6a809ad9" exitCode=0 Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerDied","Data":"fbc2e75d6ab9a5f7f585c4986ee37b0078c1050d45562a709893ad49ddf577a0"} Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerDied","Data":"ba99f5ee9386ea3c4abb3a01a964b876a74380b9523ffa0286363aa8e094612a"} Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.844869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerDied","Data":"19f1bb570219e384c0b086ef6f7d9eb806a02574c2ccc627e647b48a6a809ad9"} Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.847081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerStarted","Data":"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd"} Nov 29 02:56:39 crc kubenswrapper[4749]: I1129 02:56:39.848288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-0" event={"ID":"c4f049eb-f374-4140-b694-2af94e54001e","Type":"ContainerStarted","Data":"27f4a6013ccdadc6fa3dfc7714d28fbb1e510c3ebf21df85c0527cc69584e243"} Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.865350 4749 generic.go:334] "Generic (PLEG): container finished" podID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerID="f3f8f05a44984ecd1d427d13ca004f02a81b0ad058a17e4468ecba4ac8d429bf" exitCode=0 Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.865747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerDied","Data":"f3f8f05a44984ecd1d427d13ca004f02a81b0ad058a17e4468ecba4ac8d429bf"} Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.866122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"52dda423-f967-4f93-8b4b-90f6b130ea98","Type":"ContainerDied","Data":"b9d5e62ef3a0ec7a2228ac4b6ff7c4a08189fe2a22627309d3d8d858bf96db08"} Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.866137 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9d5e62ef3a0ec7a2228ac4b6ff7c4a08189fe2a22627309d3d8d858bf96db08" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.867999 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.868034 4749 generic.go:334] "Generic (PLEG): container finished" podID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerID="f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd" exitCode=0 Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.868053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerDied","Data":"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd"} Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960110 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkxq6\" (UniqueName: \"kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960390 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts\") pod \"52dda423-f967-4f93-8b4b-90f6b130ea98\" (UID: \"52dda423-f967-4f93-8b4b-90f6b130ea98\") " Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.960967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.961350 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.961423 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/52dda423-f967-4f93-8b4b-90f6b130ea98-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.969370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts" (OuterVolumeSpecName: "scripts") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.969590 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6" (OuterVolumeSpecName: "kube-api-access-pkxq6") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "kube-api-access-pkxq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:40 crc kubenswrapper[4749]: I1129 02:56:40.988498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.062562 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.062725 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkxq6\" (UniqueName: \"kubernetes.io/projected/52dda423-f967-4f93-8b4b-90f6b130ea98-kube-api-access-pkxq6\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.063014 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.068275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.068750 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data" (OuterVolumeSpecName: "config-data") pod "52dda423-f967-4f93-8b4b-90f6b130ea98" (UID: "52dda423-f967-4f93-8b4b-90f6b130ea98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.165735 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.165771 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52dda423-f967-4f93-8b4b-90f6b130ea98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.883736 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.981912 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:41 crc kubenswrapper[4749]: I1129 02:56:41.993013 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.007566 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:42 crc kubenswrapper[4749]: E1129 02:56:42.008462 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-notification-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.008532 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-notification-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: E1129 02:56:42.008669 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-central-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.008725 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-central-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: E1129 02:56:42.008789 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="sg-core" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.008839 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="sg-core" Nov 29 02:56:42 crc kubenswrapper[4749]: E1129 02:56:42.008891 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="proxy-httpd" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.008954 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="proxy-httpd" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.009208 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-central-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.009278 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="proxy-httpd" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.009344 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="sg-core" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.009404 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" containerName="ceilometer-notification-agent" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.011371 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.015358 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.015365 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.023871 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185522 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhbf7\" (UniqueName: \"kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.185687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 
02:56:42.287411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhbf7\" (UniqueName: \"kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287542 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.287629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.288652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.288890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.292904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.292930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.294648 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.300108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.317143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhbf7\" (UniqueName: \"kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7\") pod \"ceilometer-0\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") " pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.372164 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.910441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerStarted","Data":"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef"} Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.913426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4f049eb-f374-4140-b694-2af94e54001e","Type":"ContainerStarted","Data":"a96570e967d28c1549b87a87dc5d3f65da7df89d4c6eacb2321280c36c9e2e54"} Nov 29 02:56:42 crc kubenswrapper[4749]: I1129 02:56:42.967601 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-984d6" podStartSLOduration=3.012376835 podStartE2EDuration="6.96758451s" podCreationTimestamp="2025-11-29 02:56:36 +0000 UTC" firstStartedPulling="2025-11-29 02:56:37.809548695 +0000 UTC m=+6340.981698552" lastFinishedPulling="2025-11-29 02:56:41.76475636 +0000 UTC m=+6344.936906227" observedRunningTime="2025-11-29 02:56:42.926648696 +0000 UTC m=+6346.098798563" watchObservedRunningTime="2025-11-29 02:56:42.96758451 +0000 UTC m=+6346.139734377" Nov 29 02:56:43 crc kubenswrapper[4749]: I1129 02:56:43.089303 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52dda423-f967-4f93-8b4b-90f6b130ea98" path="/var/lib/kubelet/pods/52dda423-f967-4f93-8b4b-90f6b130ea98/volumes" Nov 29 02:56:43 crc kubenswrapper[4749]: I1129 02:56:43.090695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:56:43 crc kubenswrapper[4749]: W1129 02:56:43.094138 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef6211e_89be_4693_83c8_9b077602da01.slice/crio-ad7482909264a43500cd2873e01fc042f110bafe7b87fefa935ccebee1614bbd WatchSource:0}: Error finding container ad7482909264a43500cd2873e01fc042f110bafe7b87fefa935ccebee1614bbd: Status 404 returned error can't find the container with id ad7482909264a43500cd2873e01fc042f110bafe7b87fefa935ccebee1614bbd Nov 29 02:56:43 crc kubenswrapper[4749]: I1129 02:56:43.930583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerStarted","Data":"ad7482909264a43500cd2873e01fc042f110bafe7b87fefa935ccebee1614bbd"} Nov 29 02:56:44 crc kubenswrapper[4749]: I1129 02:56:44.955022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerStarted","Data":"eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4"} Nov 29 02:56:44 crc kubenswrapper[4749]: I1129 02:56:44.959326 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4f049eb-f374-4140-b694-2af94e54001e","Type":"ContainerStarted","Data":"0ef4e739d454e5ffc5c9cdaabe7001333e8812c82ac9c20ac75a75655e2d8a15"} Nov 29 02:56:45 crc kubenswrapper[4749]: I1129 02:56:45.972585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerStarted","Data":"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe"} Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.094307 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cmpmq"] Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.109920 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cmpmq"] Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.853045 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.853663 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.906938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.983306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerStarted","Data":"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5"} Nov 29 02:56:46 crc kubenswrapper[4749]: I1129 02:56:46.987482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c4f049eb-f374-4140-b694-2af94e54001e","Type":"ContainerStarted","Data":"fe83789c4afa417cdd44748bf79a4089d8f8a93f318575dc64cbabba71573b0d"} Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.026633 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.597019815 podStartE2EDuration="10.026618655s" podCreationTimestamp="2025-11-29 02:56:37 +0000 UTC" firstStartedPulling="2025-11-29 02:56:37.893459621 +0000 UTC m=+6341.065609478" lastFinishedPulling="2025-11-29 02:56:46.323058461 +0000 UTC m=+6349.495208318" observedRunningTime="2025-11-29 02:56:47.021532042 +0000 UTC m=+6350.193681899" watchObservedRunningTime="2025-11-29 02:56:47.026618655 +0000 UTC m=+6350.198768512" Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.047943 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghjjm"] Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.061120 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ghjjm"] Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.120632 4749 
Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.120632 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6410f87-bd21-41e5-8f40-f7a9ea97da54" path="/var/lib/kubelet/pods/d6410f87-bd21-41e5-8f40-f7a9ea97da54/volumes" Nov 29 02:56:47 crc kubenswrapper[4749]: I1129 02:56:47.140553 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1" path="/var/lib/kubelet/pods/ffc5d1a2-4a00-4d6c-b8ed-4672d67ec7f1/volumes" Nov 29 02:56:49 crc kubenswrapper[4749]: I1129 02:56:49.012104 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerStarted","Data":"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38"} Nov 29 02:56:49 crc kubenswrapper[4749]: I1129 02:56:49.013721 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.754780 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.896487297 podStartE2EDuration="10.754739698s" podCreationTimestamp="2025-11-29 02:56:41 +0000 UTC" firstStartedPulling="2025-11-29 02:56:43.096770685 +0000 UTC m=+6346.268920552" lastFinishedPulling="2025-11-29 02:56:47.955023086 +0000 UTC m=+6351.127172953" observedRunningTime="2025-11-29 02:56:49.037086456 +0000 UTC m=+6352.209236363" watchObservedRunningTime="2025-11-29 02:56:51.754739698 +0000 UTC m=+6354.926889555" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.767009 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-4c2zm"] Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.768649 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.780683 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4c2zm"] Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.870916 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-0243-account-create-update-r6mf6"] Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.872968 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.876523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.882598 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-0243-account-create-update-r6mf6"] Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.913661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzc66\" (UniqueName: \"kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66\") pod \"manila-db-create-4c2zm\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:51 crc kubenswrapper[4749]: I1129 02:56:51.913828 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts\") pod \"manila-db-create-4c2zm\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.015319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.015401 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzc66\" (UniqueName: \"kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66\") pod \"manila-db-create-4c2zm\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.015534 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk8lr\" (UniqueName: \"kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.015623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts\") pod \"manila-db-create-4c2zm\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.016567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts\") pod \"manila-db-create-4c2zm\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.036459 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzc66\" (UniqueName: \"kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66\") pod \"manila-db-create-4c2zm\" (UID: 
\"2799ee87-ddc8-4e43-abbb-26744d666d22\") " pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.086880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.117439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.117543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk8lr\" (UniqueName: \"kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.119027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.136673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk8lr\" (UniqueName: \"kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr\") pod \"manila-0243-account-create-update-r6mf6\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.194677 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:52 crc kubenswrapper[4749]: W1129 02:56:52.706829 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2799ee87_ddc8_4e43_abbb_26744d666d22.slice/crio-b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044 WatchSource:0}: Error finding container b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044: Status 404 returned error can't find the container with id b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044 Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.711342 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-4c2zm"] Nov 29 02:56:52 crc kubenswrapper[4749]: I1129 02:56:52.809942 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-0243-account-create-update-r6mf6"] Nov 29 02:56:52 crc kubenswrapper[4749]: W1129 02:56:52.832887 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f86295e_c261_49f4_ae57_0c975f06c73b.slice/crio-99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c WatchSource:0}: Error finding container 99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c: Status 404 returned error can't find the container with id 99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.052172 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4c2zm" event={"ID":"2799ee87-ddc8-4e43-abbb-26744d666d22","Type":"ContainerStarted","Data":"de2258c4a488bd1526cd672041c9cd0cf7e3fefbc5abd1dcd36f07790100992b"} Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.052236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4c2zm" event={"ID":"2799ee87-ddc8-4e43-abbb-26744d666d22","Type":"ContainerStarted","Data":"b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044"} Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.055499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0243-account-create-update-r6mf6" event={"ID":"7f86295e-c261-49f4-ae57-0c975f06c73b","Type":"ContainerStarted","Data":"ff6291ff7f4fdeec2d4f0802ca167a7fb809ae0288ebf57afcf4936a8e1823af"} Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.055543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0243-account-create-update-r6mf6" event={"ID":"7f86295e-c261-49f4-ae57-0c975f06c73b","Type":"ContainerStarted","Data":"99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c"} Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.072148 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-4c2zm" podStartSLOduration=2.072131199 podStartE2EDuration="2.072131199s" podCreationTimestamp="2025-11-29 02:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:56:53.065290553 +0000 UTC m=+6356.237440410" watchObservedRunningTime="2025-11-29 02:56:53.072131199 +0000 UTC m=+6356.244281066" Nov 29 02:56:53 crc kubenswrapper[4749]: I1129 02:56:53.092575 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-0243-account-create-update-r6mf6" podStartSLOduration=2.092555574 
podStartE2EDuration="2.092555574s" podCreationTimestamp="2025-11-29 02:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:56:53.081132927 +0000 UTC m=+6356.253282844" watchObservedRunningTime="2025-11-29 02:56:53.092555574 +0000 UTC m=+6356.264705421" Nov 29 02:56:54 crc kubenswrapper[4749]: I1129 02:56:54.071548 4749 generic.go:334] "Generic (PLEG): container finished" podID="2799ee87-ddc8-4e43-abbb-26744d666d22" containerID="de2258c4a488bd1526cd672041c9cd0cf7e3fefbc5abd1dcd36f07790100992b" exitCode=0 Nov 29 02:56:54 crc kubenswrapper[4749]: I1129 02:56:54.071670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4c2zm" event={"ID":"2799ee87-ddc8-4e43-abbb-26744d666d22","Type":"ContainerDied","Data":"de2258c4a488bd1526cd672041c9cd0cf7e3fefbc5abd1dcd36f07790100992b"} Nov 29 02:56:54 crc kubenswrapper[4749]: I1129 02:56:54.075335 4749 generic.go:334] "Generic (PLEG): container finished" podID="7f86295e-c261-49f4-ae57-0c975f06c73b" containerID="ff6291ff7f4fdeec2d4f0802ca167a7fb809ae0288ebf57afcf4936a8e1823af" exitCode=0 Nov 29 02:56:54 crc kubenswrapper[4749]: I1129 02:56:54.075375 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0243-account-create-update-r6mf6" event={"ID":"7f86295e-c261-49f4-ae57-0c975f06c73b","Type":"ContainerDied","Data":"ff6291ff7f4fdeec2d4f0802ca167a7fb809ae0288ebf57afcf4936a8e1823af"} Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.373953 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.374504 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.730046 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.736132 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.750498 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts\") pod \"2799ee87-ddc8-4e43-abbb-26744d666d22\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.750662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk8lr\" (UniqueName: \"kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr\") pod \"7f86295e-c261-49f4-ae57-0c975f06c73b\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.750716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts\") pod \"7f86295e-c261-49f4-ae57-0c975f06c73b\" (UID: \"7f86295e-c261-49f4-ae57-0c975f06c73b\") " Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.750837 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzc66\" (UniqueName: \"kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66\") pod \"2799ee87-ddc8-4e43-abbb-26744d666d22\" (UID: \"2799ee87-ddc8-4e43-abbb-26744d666d22\") " Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.752816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2799ee87-ddc8-4e43-abbb-26744d666d22" (UID: "2799ee87-ddc8-4e43-abbb-26744d666d22"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.753362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f86295e-c261-49f4-ae57-0c975f06c73b" (UID: "7f86295e-c261-49f4-ae57-0c975f06c73b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.757567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr" (OuterVolumeSpecName: "kube-api-access-fk8lr") pod "7f86295e-c261-49f4-ae57-0c975f06c73b" (UID: "7f86295e-c261-49f4-ae57-0c975f06c73b"). InnerVolumeSpecName "kube-api-access-fk8lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.758355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66" (OuterVolumeSpecName: "kube-api-access-nzc66") pod "2799ee87-ddc8-4e43-abbb-26744d666d22" (UID: "2799ee87-ddc8-4e43-abbb-26744d666d22"). InnerVolumeSpecName "kube-api-access-nzc66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.852128 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk8lr\" (UniqueName: \"kubernetes.io/projected/7f86295e-c261-49f4-ae57-0c975f06c73b-kube-api-access-fk8lr\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.852160 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f86295e-c261-49f4-ae57-0c975f06c73b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.852170 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzc66\" (UniqueName: \"kubernetes.io/projected/2799ee87-ddc8-4e43-abbb-26744d666d22-kube-api-access-nzc66\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:55 crc kubenswrapper[4749]: I1129 02:56:55.852179 4749 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2799ee87-ddc8-4e43-abbb-26744d666d22-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.118870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-0243-account-create-update-r6mf6" event={"ID":"7f86295e-c261-49f4-ae57-0c975f06c73b","Type":"ContainerDied","Data":"99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c"} Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.118926 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-0243-account-create-update-r6mf6" Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.118961 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99377d895f3048478fab9ff4e6216a4743f449473f31e2d73107dd448cee2c9c" Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.121898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-4c2zm" event={"ID":"2799ee87-ddc8-4e43-abbb-26744d666d22","Type":"ContainerDied","Data":"b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044"} Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.121952 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b121c587dacfe43757d25f0e18a3e2a2ce979c5e6e71e3267a503b8534007044" Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.121998 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-4c2zm" Nov 29 02:56:56 crc kubenswrapper[4749]: I1129 02:56:56.953036 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.031709 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.167006 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-984d6" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="registry-server" containerID="cri-o://57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef" gracePeriod=2 Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.187095 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-5ljqf"] Nov 29 02:56:57 crc kubenswrapper[4749]: E1129 02:56:57.189616 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f86295e-c261-49f4-ae57-0c975f06c73b" containerName="mariadb-account-create-update" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.189641 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f86295e-c261-49f4-ae57-0c975f06c73b" containerName="mariadb-account-create-update" Nov 29 02:56:57 crc kubenswrapper[4749]: E1129 02:56:57.189671 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2799ee87-ddc8-4e43-abbb-26744d666d22" containerName="mariadb-database-create" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.189679 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2799ee87-ddc8-4e43-abbb-26744d666d22" containerName="mariadb-database-create" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.190733 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2799ee87-ddc8-4e43-abbb-26744d666d22" containerName="mariadb-database-create" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.190763 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f86295e-c261-49f4-ae57-0c975f06c73b" containerName="mariadb-account-create-update" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.192373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.198787 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-tpcdp" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.198975 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.212452 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-5ljqf"] Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.297783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.297855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.297890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.297991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcmtf\" (UniqueName: \"kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.399962 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcmtf\" (UniqueName: \"kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.400052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.400091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.400119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data\") pod \"manila-db-sync-5ljqf\" (UID: 
\"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.405869 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.406378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.407647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.417030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcmtf\" (UniqueName: \"kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf\") pod \"manila-db-sync-5ljqf\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.573109 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-5ljqf" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.724693 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.810679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities\") pod \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.811036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities" (OuterVolumeSpecName: "utilities") pod "9006334e-6cca-4531-95a8-6bc7b2fa7b4d" (UID: "9006334e-6cca-4531-95a8-6bc7b2fa7b4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.811387 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content\") pod \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.811478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npswl\" (UniqueName: \"kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl\") pod \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\" (UID: \"9006334e-6cca-4531-95a8-6bc7b2fa7b4d\") " Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.812858 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.816901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl" (OuterVolumeSpecName: "kube-api-access-npswl") pod "9006334e-6cca-4531-95a8-6bc7b2fa7b4d" (UID: "9006334e-6cca-4531-95a8-6bc7b2fa7b4d"). InnerVolumeSpecName "kube-api-access-npswl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.901039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9006334e-6cca-4531-95a8-6bc7b2fa7b4d" (UID: "9006334e-6cca-4531-95a8-6bc7b2fa7b4d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.914662 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:57 crc kubenswrapper[4749]: I1129 02:56:57.914700 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npswl\" (UniqueName: \"kubernetes.io/projected/9006334e-6cca-4531-95a8-6bc7b2fa7b4d-kube-api-access-npswl\") on node \"crc\" DevicePath \"\"" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.177716 4749 generic.go:334] "Generic (PLEG): container finished" podID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerID="57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef" exitCode=0 Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.177862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerDied","Data":"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef"} Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.178763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-984d6" event={"ID":"9006334e-6cca-4531-95a8-6bc7b2fa7b4d","Type":"ContainerDied","Data":"ae9d190e58bbcfd66e134b93237f50b07012f844fcc84aeb5a3f84f2ec8adac7"} Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.178827 4749 scope.go:117] "RemoveContainer" containerID="57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.177935 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-984d6" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.213769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-5ljqf"] Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.218613 4749 scope.go:117] "RemoveContainer" containerID="f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd" Nov 29 02:56:58 crc kubenswrapper[4749]: W1129 02:56:58.232558 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92025c44_2545_4dc1_82f8_822ce5da38d6.slice/crio-d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02 WatchSource:0}: Error finding container d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02: Status 404 returned error can't find the container with id d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02 Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.242892 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.253505 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-984d6"] Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.264670 4749 scope.go:117] "RemoveContainer" containerID="88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.294373 4749 scope.go:117] "RemoveContainer" containerID="57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef" Nov 29 02:56:58 crc kubenswrapper[4749]: E1129 02:56:58.294992 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef\": container with ID starting with 57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef not found: ID does not exist" containerID="57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.295031 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef"} err="failed to get container status \"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef\": rpc error: code = NotFound desc = could not find container \"57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef\": container with ID starting with 57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef not found: ID does not exist" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.295056 4749 scope.go:117] "RemoveContainer" containerID="f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd" Nov 29 02:56:58 crc kubenswrapper[4749]: E1129 02:56:58.295477 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd\": container with ID starting with f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd not found: ID does not exist" containerID="f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.295501 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd"} err="failed to get container status \"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd\": rpc error: code = NotFound desc = could not find container \"f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd\": container with ID starting with f3d60d027e397bf20f94c56bab0e1a9416d35b1db7a54996836d3fdf0ce5bbfd not found: ID does not exist" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.295517 4749 scope.go:117] "RemoveContainer" containerID="88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5" Nov 29 02:56:58 crc kubenswrapper[4749]: E1129 02:56:58.295824 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5\": container with ID starting with 88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5 not found: ID does not exist" containerID="88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5" Nov 29 02:56:58 crc kubenswrapper[4749]: I1129 02:56:58.295858 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5"} err="failed to get container status \"88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5\": rpc error: code = NotFound desc = could not find container \"88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5\": container with ID starting with 88a7f8d0a70d4db3bb30f87a970b7b5be68398cfc648ed68b77d30a54d45abe5 not found: ID does not exist" Nov 29 02:56:59 crc kubenswrapper[4749]: I1129 02:56:59.089911 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" path="/var/lib/kubelet/pods/9006334e-6cca-4531-95a8-6bc7b2fa7b4d/volumes" Nov 29 02:56:59 crc kubenswrapper[4749]: I1129 02:56:59.192992 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-5ljqf" event={"ID":"92025c44-2545-4dc1-82f8-822ce5da38d6","Type":"ContainerStarted","Data":"d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02"} Nov 29 02:57:03 crc kubenswrapper[4749]: I1129 02:57:03.243754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-5ljqf" event={"ID":"92025c44-2545-4dc1-82f8-822ce5da38d6","Type":"ContainerStarted","Data":"7b70211361ce7ecc3c19913935fbb34ebe80deb787d90dfe63df6fa5d8b02181"} Nov 29 02:57:03 crc kubenswrapper[4749]: I1129 02:57:03.281404 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-5ljqf" podStartSLOduration=2.453322748 podStartE2EDuration="6.281377678s" podCreationTimestamp="2025-11-29 02:56:57 +0000 UTC" firstStartedPulling="2025-11-29 02:56:58.235883953 +0000 UTC m=+6361.408033810" lastFinishedPulling="2025-11-29 02:57:02.063938873 +0000 UTC m=+6365.236088740" observedRunningTime="2025-11-29 02:57:03.271359195 +0000 UTC m=+6366.443509112" watchObservedRunningTime="2025-11-29 02:57:03.281377678 +0000 UTC m=+6366.453527565" Nov 29 02:57:04 crc kubenswrapper[4749]: I1129 02:57:04.258010 4749 generic.go:334] "Generic (PLEG): container finished" podID="92025c44-2545-4dc1-82f8-822ce5da38d6" containerID="7b70211361ce7ecc3c19913935fbb34ebe80deb787d90dfe63df6fa5d8b02181" exitCode=0 Nov 29 02:57:04 crc kubenswrapper[4749]: I1129 02:57:04.258080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/manila-db-sync-5ljqf" event={"ID":"92025c44-2545-4dc1-82f8-822ce5da38d6","Type":"ContainerDied","Data":"7b70211361ce7ecc3c19913935fbb34ebe80deb787d90dfe63df6fa5d8b02181"} Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.057903 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4872k"] Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.071811 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4872k"] Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.089824 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f1aab52-f574-4c9e-aef6-735905de460f" path="/var/lib/kubelet/pods/3f1aab52-f574-4c9e-aef6-735905de460f/volumes" Nov 29 02:57:05 crc kubenswrapper[4749]: E1129 02:57:05.449930 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.833039 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-5ljqf" Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.996570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data\") pod \"92025c44-2545-4dc1-82f8-822ce5da38d6\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.996624 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcmtf\" (UniqueName: \"kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf\") pod \"92025c44-2545-4dc1-82f8-822ce5da38d6\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.996691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle\") pod \"92025c44-2545-4dc1-82f8-822ce5da38d6\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " Nov 29 02:57:05 crc kubenswrapper[4749]: I1129 02:57:05.996791 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data\") pod \"92025c44-2545-4dc1-82f8-822ce5da38d6\" (UID: \"92025c44-2545-4dc1-82f8-822ce5da38d6\") " Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.002871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "92025c44-2545-4dc1-82f8-822ce5da38d6" (UID: "92025c44-2545-4dc1-82f8-822ce5da38d6"). InnerVolumeSpecName "job-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.002916 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf" (OuterVolumeSpecName: "kube-api-access-xcmtf") pod "92025c44-2545-4dc1-82f8-822ce5da38d6" (UID: "92025c44-2545-4dc1-82f8-822ce5da38d6"). InnerVolumeSpecName "kube-api-access-xcmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.006233 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data" (OuterVolumeSpecName: "config-data") pod "92025c44-2545-4dc1-82f8-822ce5da38d6" (UID: "92025c44-2545-4dc1-82f8-822ce5da38d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.043436 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92025c44-2545-4dc1-82f8-822ce5da38d6" (UID: "92025c44-2545-4dc1-82f8-822ce5da38d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.099066 4749 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.099101 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcmtf\" (UniqueName: \"kubernetes.io/projected/92025c44-2545-4dc1-82f8-822ce5da38d6-kube-api-access-xcmtf\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.099111 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.099121 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92025c44-2545-4dc1-82f8-822ce5da38d6-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.282062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-5ljqf" event={"ID":"92025c44-2545-4dc1-82f8-822ce5da38d6","Type":"ContainerDied","Data":"d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02"} Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.282095 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e71df2ae5bfe2fadf5a59ad59f0c2f32c381f13bc96979ba0c05d693b37e02" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.282182 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-5ljqf" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.681808 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 29 02:57:06 crc kubenswrapper[4749]: E1129 02:57:06.682564 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="registry-server" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682580 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="registry-server" Nov 29 02:57:06 crc kubenswrapper[4749]: E1129 02:57:06.682615 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="extract-utilities" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="extract-utilities" Nov 29 02:57:06 crc kubenswrapper[4749]: E1129 02:57:06.682652 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92025c44-2545-4dc1-82f8-822ce5da38d6" containerName="manila-db-sync" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682661 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="92025c44-2545-4dc1-82f8-822ce5da38d6" containerName="manila-db-sync" Nov 29 02:57:06 crc kubenswrapper[4749]: E1129 02:57:06.682682 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="extract-content" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682689 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="extract-content" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682926 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9006334e-6cca-4531-95a8-6bc7b2fa7b4d" containerName="registry-server" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.682951 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="92025c44-2545-4dc1-82f8-822ce5da38d6" containerName="manila-db-sync" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.684248 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.687438 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.687880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.691250 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.694155 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-tpcdp" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.702073 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831416 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4fb28a84-0189-40d7-9be0-5de128a0290c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddmn5\" (UniqueName: \"kubernetes.io/projected/4fb28a84-0189-40d7-9be0-5de128a0290c-kube-api-access-ddmn5\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831521 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-scripts\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.831656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.862871 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.864910 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.867127 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.888292 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.931830 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.939210 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4fb28a84-0189-40d7-9be0-5de128a0290c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.939349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddmn5\" (UniqueName: \"kubernetes.io/projected/4fb28a84-0189-40d7-9be0-5de128a0290c-kube-api-access-ddmn5\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.939390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-scripts\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.940323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.940431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.940468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.941367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4fb28a84-0189-40d7-9be0-5de128a0290c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.964207 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.970102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.970629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-scripts\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.972242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.987961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddmn5\" (UniqueName: \"kubernetes.io/projected/4fb28a84-0189-40d7-9be0-5de128a0290c-kube-api-access-ddmn5\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.995904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb28a84-0189-40d7-9be0-5de128a0290c-config-data\") pod \"manila-scheduler-0\" (UID: \"4fb28a84-0189-40d7-9be0-5de128a0290c\") " pod="openstack/manila-scheduler-0" Nov 29 02:57:06 crc kubenswrapper[4749]: I1129 02:57:06.997765 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.042819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-ceph\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dxc\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-kube-api-access-68dxc\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-scripts\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") 
" pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.043970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.046739 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.141177 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.142935 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.144301 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145829 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145866 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145917 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145977 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.146001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-ceph\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.146123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.145952 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.147119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148224 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z88q\" (UniqueName: \"kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148331 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68dxc\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-kube-api-access-68dxc\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148486 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-scripts\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.148516 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.149125 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.149358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-ceph\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.151523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-scripts\") pod \"manila-share-share1-0\" (UID: 
\"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.164977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.180977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68dxc\" (UniqueName: \"kubernetes.io/projected/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-kube-api-access-68dxc\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.184851 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8bba430-1fea-4c72-a7cd-bdb8f6d91533-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e8bba430-1fea-4c72-a7cd-bdb8f6d91533\") " pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250563 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-scripts\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/460434e5-fafa-4ef1-b56d-f266f28c6a76-etc-machine-id\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z88q\" (UniqueName: \"kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/460434e5-fafa-4ef1-b56d-f266f28c6a76-logs\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250796 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.250885 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data-custom\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.251116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvpw\" (UniqueName: \"kubernetes.io/projected/460434e5-fafa-4ef1-b56d-f266f28c6a76-kube-api-access-cvvpw\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.251148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.251171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.252641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.252800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.257079 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.257613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb\") pod 
\"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.257639 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.280904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z88q\" (UniqueName: \"kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q\") pod \"dnsmasq-dns-546989675f-phc7t\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-scripts\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/460434e5-fafa-4ef1-b56d-f266f28c6a76-etc-machine-id\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/460434e5-fafa-4ef1-b56d-f266f28c6a76-logs\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data-custom\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.354724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvpw\" (UniqueName: \"kubernetes.io/projected/460434e5-fafa-4ef1-b56d-f266f28c6a76-kube-api-access-cvvpw\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.355464 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/460434e5-fafa-4ef1-b56d-f266f28c6a76-logs\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 
02:57:07.355507 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/460434e5-fafa-4ef1-b56d-f266f28c6a76-etc-machine-id\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.363431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.366025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-scripts\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.369404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.369915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/460434e5-fafa-4ef1-b56d-f266f28c6a76-config-data-custom\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.381486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvpw\" (UniqueName: \"kubernetes.io/projected/460434e5-fafa-4ef1-b56d-f266f28c6a76-kube-api-access-cvvpw\") pod \"manila-api-0\" (UID: \"460434e5-fafa-4ef1-b56d-f266f28c6a76\") " pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.563798 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.569122 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.753391 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 29 02:57:07 crc kubenswrapper[4749]: W1129 02:57:07.765191 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fb28a84_0189_40d7_9be0_5de128a0290c.slice/crio-21aca2bdfac9185a003813b25c30f1b43001b10234a152b2f9e8d47a4f8ccc49 WatchSource:0}: Error finding container 21aca2bdfac9185a003813b25c30f1b43001b10234a152b2f9e8d47a4f8ccc49: Status 404 returned error can't find the container with id 21aca2bdfac9185a003813b25c30f1b43001b10234a152b2f9e8d47a4f8ccc49 Nov 29 02:57:07 crc kubenswrapper[4749]: I1129 02:57:07.957410 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 29 02:57:08 crc kubenswrapper[4749]: I1129 02:57:08.164956 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:57:08 crc kubenswrapper[4749]: W1129 02:57:08.174958 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc131e997_2d41_4b8b_b10b_07d028170709.slice/crio-67250c1cd7f41e5622f3fa5cd544525017153346af742440fb47f330a6dea4d5 WatchSource:0}: Error finding container 67250c1cd7f41e5622f3fa5cd544525017153346af742440fb47f330a6dea4d5: Status 404 returned error can't find the container with id 67250c1cd7f41e5622f3fa5cd544525017153346af742440fb47f330a6dea4d5 Nov 29 02:57:08 crc kubenswrapper[4749]: I1129 02:57:08.348862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4fb28a84-0189-40d7-9be0-5de128a0290c","Type":"ContainerStarted","Data":"21aca2bdfac9185a003813b25c30f1b43001b10234a152b2f9e8d47a4f8ccc49"} Nov 29 02:57:08 crc kubenswrapper[4749]: I1129 02:57:08.351109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e8bba430-1fea-4c72-a7cd-bdb8f6d91533","Type":"ContainerStarted","Data":"c7e569191baf1a54a042c66b6ed1b0dc68741042f76224a087f1dfbd3cc3eae8"} Nov 29 02:57:08 crc kubenswrapper[4749]: I1129 02:57:08.354188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546989675f-phc7t" event={"ID":"c131e997-2d41-4b8b-b10b-07d028170709","Type":"ContainerStarted","Data":"67250c1cd7f41e5622f3fa5cd544525017153346af742440fb47f330a6dea4d5"} Nov 29 02:57:08 crc kubenswrapper[4749]: I1129 02:57:08.372338 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 29 02:57:09 crc kubenswrapper[4749]: I1129 02:57:09.371778 4749 generic.go:334] "Generic (PLEG): container finished" podID="c131e997-2d41-4b8b-b10b-07d028170709" containerID="7c26699b16d299fc549fc3bc48472a7dd5faa53930cc7c653970f2c2869f89e5" exitCode=0 Nov 29 02:57:09 crc kubenswrapper[4749]: I1129 02:57:09.371961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546989675f-phc7t" event={"ID":"c131e997-2d41-4b8b-b10b-07d028170709","Type":"ContainerDied","Data":"7c26699b16d299fc549fc3bc48472a7dd5faa53930cc7c653970f2c2869f89e5"} Nov 29 02:57:09 crc kubenswrapper[4749]: I1129 02:57:09.375890 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"460434e5-fafa-4ef1-b56d-f266f28c6a76","Type":"ContainerStarted","Data":"a72a22ab1ca74c4ffe94c727c9c6176fa9c73a8961759efbfe102a5218ac32ee"} 
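The pod_startup_latency_tracker entries above (for manila-db-sync-5ljqf) and below (for dnsmasq-dns-546989675f-phc7t, manila-scheduler-0, and manila-api-0) each report two durations: podStartE2EDuration, the watch-observed running time minus podCreationTimestamp, and podStartSLOduration, which additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). Pods that pulled no image log the zero time 0001-01-01 00:00:00 for both pull stamps, and the two durations coincide (as for dnsmasq-dns and manila-api-0 below). A minimal Go sketch, assuming that relationship and reusing the manila-db-sync-5ljqf timestamps logged at 02:57:03, reproduces both figures; the helper names are illustrative, not kubelet's.

// Recompute the startup metrics for manila-db-sync-5ljqf from the
// timestamps in the log entry above. Only the arithmetic is being
// checked: SLO duration = end-to-end duration minus image-pull window.
package main

import (
	"fmt"
	"time"
)

// Layout matching Go's default time.Time string form used in the log.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-29 02:56:57 +0000 UTC")                // podCreationTimestamp
	firstPull := mustParse("2025-11-29 02:56:58.235883953 +0000 UTC")   // firstStartedPulling
	lastPull := mustParse("2025-11-29 02:57:02.063938873 +0000 UTC")    // lastFinishedPulling
	observed := mustParse("2025-11-29 02:57:03.281377678 +0000 UTC")    // watchObservedRunningTime

	e2e := observed.Sub(created)         // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: e2e minus pull window

	fmt.Println(e2e) // 6.281377678s, matching the logged podStartE2EDuration
	fmt.Println(slo) // ~2.453322758s vs the logged 2.453322748s
}

Running it prints 6.281377678s and 2.453322758s; the logged podStartSLOduration=2.453322748 differs only in the last digits, consistent with kubelet computing on the monotonic clock (the m=+... offsets) rather than on the wall-clock stamps printed here.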
Nov 29 02:57:09 crc kubenswrapper[4749]: I1129 02:57:09.375936 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"460434e5-fafa-4ef1-b56d-f266f28c6a76","Type":"ContainerStarted","Data":"2ea7b6461f1d48917597fe88ef5c617f530e6e5013f9bdcd0068b002f520858b"} Nov 29 02:57:09 crc kubenswrapper[4749]: I1129 02:57:09.394040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4fb28a84-0189-40d7-9be0-5de128a0290c","Type":"ContainerStarted","Data":"db8d997974146ba28f71866c53627c2d2d7e8c641d551bee31fb352ef2f0d379"} Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.415716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546989675f-phc7t" event={"ID":"c131e997-2d41-4b8b-b10b-07d028170709","Type":"ContainerStarted","Data":"1b756fe6e412e4b37d27dfca6f93ba2390f35d15ef4237393519ba499966ee0a"} Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.416097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.419291 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"460434e5-fafa-4ef1-b56d-f266f28c6a76","Type":"ContainerStarted","Data":"51e9efcfa30e26af20cdd1a0ce679fbcd1958672f6a58e8af24608ffceedacd8"} Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.419421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.421565 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4fb28a84-0189-40d7-9be0-5de128a0290c","Type":"ContainerStarted","Data":"0c5c88aa629f8fced7d72c3af319c083b3146159293efa0272475a88bccf6906"} Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.435855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-546989675f-phc7t" podStartSLOduration=4.435839184 podStartE2EDuration="4.435839184s" podCreationTimestamp="2025-11-29 02:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:57:10.433535018 +0000 UTC m=+6373.605684875" watchObservedRunningTime="2025-11-29 02:57:10.435839184 +0000 UTC m=+6373.607989041" Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.453480 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.648526197 podStartE2EDuration="4.453466391s" podCreationTimestamp="2025-11-29 02:57:06 +0000 UTC" firstStartedPulling="2025-11-29 02:57:07.767723333 +0000 UTC m=+6370.939873180" lastFinishedPulling="2025-11-29 02:57:08.572663517 +0000 UTC m=+6371.744813374" observedRunningTime="2025-11-29 02:57:10.451188416 +0000 UTC m=+6373.623338273" watchObservedRunningTime="2025-11-29 02:57:10.453466391 +0000 UTC m=+6373.625616248" Nov 29 02:57:10 crc kubenswrapper[4749]: I1129 02:57:10.473889 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.473868967 podStartE2EDuration="3.473868967s" podCreationTimestamp="2025-11-29 02:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:57:10.471447648 +0000 UTC m=+6373.643597505" watchObservedRunningTime="2025-11-29 
02:57:10.473868967 +0000 UTC m=+6373.646018824" Nov 29 02:57:12 crc kubenswrapper[4749]: I1129 02:57:12.379151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 02:57:14 crc kubenswrapper[4749]: I1129 02:57:14.483346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e8bba430-1fea-4c72-a7cd-bdb8f6d91533","Type":"ContainerStarted","Data":"c767bbf2132a4944257b2c10d5a3d51ec3ada087f8c550e20123183d614a5a7d"} Nov 29 02:57:15 crc kubenswrapper[4749]: I1129 02:57:15.503636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e8bba430-1fea-4c72-a7cd-bdb8f6d91533","Type":"ContainerStarted","Data":"bacd26ec3b1ff18ebc3368955a1bc64709f0471ca5405a6b6180983d57c388a3"} Nov 29 02:57:15 crc kubenswrapper[4749]: I1129 02:57:15.550078 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.89100679 podStartE2EDuration="9.550058336s" podCreationTimestamp="2025-11-29 02:57:06 +0000 UTC" firstStartedPulling="2025-11-29 02:57:07.958044671 +0000 UTC m=+6371.130194528" lastFinishedPulling="2025-11-29 02:57:13.617096217 +0000 UTC m=+6376.789246074" observedRunningTime="2025-11-29 02:57:15.538563867 +0000 UTC m=+6378.710713784" watchObservedRunningTime="2025-11-29 02:57:15.550058336 +0000 UTC m=+6378.722208203" Nov 29 02:57:15 crc kubenswrapper[4749]: E1129 02:57:15.808087 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:17 crc kubenswrapper[4749]: I1129 02:57:17.048015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 29 02:57:17 crc kubenswrapper[4749]: I1129 02:57:17.258622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 29 02:57:17 crc kubenswrapper[4749]: I1129 02:57:17.565707 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:57:17 crc kubenswrapper[4749]: I1129 02:57:17.663395 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:57:17 crc kubenswrapper[4749]: I1129 02:57:17.663735 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="dnsmasq-dns" containerID="cri-o://3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8" gracePeriod=10 Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.241138 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.420306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.420379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.420672 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.420779 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dqbn\" (UniqueName: \"kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.420956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.429737 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn" (OuterVolumeSpecName: "kube-api-access-9dqbn") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "kube-api-access-9dqbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.489480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.492853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config" (OuterVolumeSpecName: "config") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.497600 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.522823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.523593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") pod \"920be425-9042-4830-8468-6dd624a20d43\" (UID: \"920be425-9042-4830-8468-6dd624a20d43\") " Nov 29 02:57:18 crc kubenswrapper[4749]: W1129 02:57:18.523801 4749 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/920be425-9042-4830-8468-6dd624a20d43/volumes/kubernetes.io~configmap/ovsdbserver-nb Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.523829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "920be425-9042-4830-8468-6dd624a20d43" (UID: "920be425-9042-4830-8468-6dd624a20d43"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.524312 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.524346 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.524364 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.524382 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dqbn\" (UniqueName: \"kubernetes.io/projected/920be425-9042-4830-8468-6dd624a20d43-kube-api-access-9dqbn\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.524394 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/920be425-9042-4830-8468-6dd624a20d43-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.539900 4749 generic.go:334] "Generic (PLEG): container finished" podID="920be425-9042-4830-8468-6dd624a20d43" containerID="3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8" exitCode=0 Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.540024 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.540089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" event={"ID":"920be425-9042-4830-8468-6dd624a20d43","Type":"ContainerDied","Data":"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8"} Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.540126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bc657d5c-djh2c" event={"ID":"920be425-9042-4830-8468-6dd624a20d43","Type":"ContainerDied","Data":"49b8c7db550ab7b46f364cc04956fd51686affc7fa1f586b44149ad9db7b076d"} Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.540146 4749 scope.go:117] "RemoveContainer" containerID="3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.574530 4749 scope.go:117] "RemoveContainer" containerID="f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.589541 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.601152 4749 scope.go:117] "RemoveContainer" containerID="3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8" Nov 29 02:57:18 crc kubenswrapper[4749]: E1129 02:57:18.601561 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8\": container with ID starting with 3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8 not found: ID does not exist" containerID="3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.601608 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8"} err="failed to get container status \"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8\": rpc error: code = NotFound desc = could not find container \"3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8\": container with ID starting with 3ded07dd997b136310754f475d8017a3b5073be90d7d54ed5338b8a8086beda8 not found: ID does not exist" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.601630 4749 scope.go:117] "RemoveContainer" containerID="f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed" Nov 29 02:57:18 crc kubenswrapper[4749]: E1129 02:57:18.601920 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed\": container with ID starting with f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed not found: ID does not exist" containerID="f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.601961 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed"} err="failed to get container status \"f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed\": rpc error: code = NotFound desc = could not find container 
\"f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed\": container with ID starting with f3812a9abb45c6496bb4f37d755d936ce62bf95ad5ce29c2c6b2d4797536b0ed not found: ID does not exist" Nov 29 02:57:18 crc kubenswrapper[4749]: I1129 02:57:18.601937 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bc657d5c-djh2c"] Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.097183 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920be425-9042-4830-8468-6dd624a20d43" path="/var/lib/kubelet/pods/920be425-9042-4830-8468-6dd624a20d43/volumes" Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.599572 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.599869 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-central-agent" containerID="cri-o://eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4" gracePeriod=30 Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.600359 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="proxy-httpd" containerID="cri-o://a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38" gracePeriod=30 Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.600422 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="sg-core" containerID="cri-o://c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5" gracePeriod=30 Nov 29 02:57:19 crc kubenswrapper[4749]: I1129 02:57:19.600465 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-notification-agent" containerID="cri-o://532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe" gracePeriod=30 Nov 29 02:57:20 crc kubenswrapper[4749]: I1129 02:57:20.562614 4749 generic.go:334] "Generic (PLEG): container finished" podID="eef6211e-89be-4693-83c8-9b077602da01" containerID="a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38" exitCode=0 Nov 29 02:57:20 crc kubenswrapper[4749]: I1129 02:57:20.562936 4749 generic.go:334] "Generic (PLEG): container finished" podID="eef6211e-89be-4693-83c8-9b077602da01" containerID="c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5" exitCode=2 Nov 29 02:57:20 crc kubenswrapper[4749]: I1129 02:57:20.562948 4749 generic.go:334] "Generic (PLEG): container finished" podID="eef6211e-89be-4693-83c8-9b077602da01" containerID="eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4" exitCode=0 Nov 29 02:57:20 crc kubenswrapper[4749]: I1129 02:57:20.562652 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerDied","Data":"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38"} Nov 29 02:57:20 crc kubenswrapper[4749]: I1129 02:57:20.562994 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerDied","Data":"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5"} Nov 29 02:57:20 crc 
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.286141 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.393635 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.393686 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhbf7\" (UniqueName: \"kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.393853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.393873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.394170 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.394242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.394550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.394710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts\") pod \"eef6211e-89be-4693-83c8-9b077602da01\" (UID: \"eef6211e-89be-4693-83c8-9b077602da01\") "
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.395212 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.396818 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.405374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7" (OuterVolumeSpecName: "kube-api-access-fhbf7") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "kube-api-access-fhbf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.405446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts" (OuterVolumeSpecName: "scripts") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.421748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.477365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.496823 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-scripts\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.496849 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.496860 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhbf7\" (UniqueName: \"kubernetes.io/projected/eef6211e-89be-4693-83c8-9b077602da01-kube-api-access-fhbf7\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.496869 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.496877 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eef6211e-89be-4693-83c8-9b077602da01-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.503630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data" (OuterVolumeSpecName: "config-data") pod "eef6211e-89be-4693-83c8-9b077602da01" (UID: "eef6211e-89be-4693-83c8-9b077602da01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.577982 4749 generic.go:334] "Generic (PLEG): container finished" podID="eef6211e-89be-4693-83c8-9b077602da01" containerID="532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe" exitCode=0
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.578020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerDied","Data":"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe"}
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.578044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eef6211e-89be-4693-83c8-9b077602da01","Type":"ContainerDied","Data":"ad7482909264a43500cd2873e01fc042f110bafe7b87fefa935ccebee1614bbd"}
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.578061 4749 scope.go:117] "RemoveContainer" containerID="a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.578180 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.598411 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef6211e-89be-4693-83c8-9b077602da01-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.614334 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.625868 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.637339 4749 scope.go:117] "RemoveContainer" containerID="c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.646978 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647588 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-central-agent" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-central-agent" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647651 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="sg-core" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647663 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="sg-core" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647683 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-notification-agent" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647694 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-notification-agent" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647718 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="dnsmasq-dns" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647729 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="dnsmasq-dns" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647760 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="init" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647770 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="init" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.647820 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="proxy-httpd" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.647831 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="proxy-httpd" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.648192 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-notification-agent" Nov 29 02:57:21 crc 
kubenswrapper[4749]: I1129 02:57:21.648279 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="920be425-9042-4830-8468-6dd624a20d43" containerName="dnsmasq-dns" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.648302 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="proxy-httpd" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.648327 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="sg-core" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.648359 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef6211e-89be-4693-83c8-9b077602da01" containerName="ceilometer-central-agent" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.651104 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.653665 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.653909 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.656950 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.660376 4749 scope.go:117] "RemoveContainer" containerID="532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.704872 4749 scope.go:117] "RemoveContainer" containerID="eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.726683 4749 scope.go:117] "RemoveContainer" containerID="a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.734425 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38\": container with ID starting with a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38 not found: ID does not exist" containerID="a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.734476 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38"} err="failed to get container status \"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38\": rpc error: code = NotFound desc = could not find container \"a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38\": container with ID starting with a3f72ad281d24dad2cd57d473c0cda84778e3308d2147da3407ddce996814c38 not found: ID does not exist" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.734504 4749 scope.go:117] "RemoveContainer" containerID="c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5" Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.734968 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5\": container with ID starting with 
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.735011 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5"} err="failed to get container status \"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5\": rpc error: code = NotFound desc = could not find container \"c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5\": container with ID starting with c1f56d4002b7d8f0ed69d7a5d9e3b1aca6d28a9711bf4f343a0511265fe145f5 not found: ID does not exist"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.735038 4749 scope.go:117] "RemoveContainer" containerID="532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe"
Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.735358 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe\": container with ID starting with 532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe not found: ID does not exist" containerID="532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.735390 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe"} err="failed to get container status \"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe\": rpc error: code = NotFound desc = could not find container \"532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe\": container with ID starting with 532d24bb6ad5e24655848a81aa1db269f5d29e6cc83d90a68aea57b9fa4f38fe not found: ID does not exist"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.735410 4749 scope.go:117] "RemoveContainer" containerID="eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4"
Nov 29 02:57:21 crc kubenswrapper[4749]: E1129 02:57:21.735856 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4\": container with ID starting with eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4 not found: ID does not exist" containerID="eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.735884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4"} err="failed to get container status \"eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4\": rpc error: code = NotFound desc = could not find container \"eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4\": container with ID starting with eaffddab6d994ff024ca441d7aeafc01b4428003447032bd3055bda2bfb08ef4 not found: ID does not exist"
Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.801955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0"
\"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-scripts\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9wqg\" (UniqueName: \"kubernetes.io/projected/b6b84c17-5e5b-4464-9890-31bb49853d6d-kube-api-access-t9wqg\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802527 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-config-data\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802552 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.802575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-config-data\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-scripts\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.903980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.904006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9wqg\" (UniqueName: \"kubernetes.io/projected/b6b84c17-5e5b-4464-9890-31bb49853d6d-kube-api-access-t9wqg\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.904800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-log-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.904948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6b84c17-5e5b-4464-9890-31bb49853d6d-run-httpd\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.909170 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-scripts\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.909307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-config-data\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.910078 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.911110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6b84c17-5e5b-4464-9890-31bb49853d6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.919590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9wqg\" (UniqueName: 
\"kubernetes.io/projected/b6b84c17-5e5b-4464-9890-31bb49853d6d-kube-api-access-t9wqg\") pod \"ceilometer-0\" (UID: \"b6b84c17-5e5b-4464-9890-31bb49853d6d\") " pod="openstack/ceilometer-0" Nov 29 02:57:21 crc kubenswrapper[4749]: I1129 02:57:21.976557 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 29 02:57:22 crc kubenswrapper[4749]: W1129 02:57:22.494890 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b84c17_5e5b_4464_9890_31bb49853d6d.slice/crio-b3f116ab8a7be05565afca7c8adf41982afbb1e1aac53acd7c0a69feda0d1abc WatchSource:0}: Error finding container b3f116ab8a7be05565afca7c8adf41982afbb1e1aac53acd7c0a69feda0d1abc: Status 404 returned error can't find the container with id b3f116ab8a7be05565afca7c8adf41982afbb1e1aac53acd7c0a69feda0d1abc Nov 29 02:57:22 crc kubenswrapper[4749]: I1129 02:57:22.506367 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 29 02:57:22 crc kubenswrapper[4749]: I1129 02:57:22.587466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6b84c17-5e5b-4464-9890-31bb49853d6d","Type":"ContainerStarted","Data":"b3f116ab8a7be05565afca7c8adf41982afbb1e1aac53acd7c0a69feda0d1abc"} Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.093160 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef6211e-89be-4693-83c8-9b077602da01" path="/var/lib/kubelet/pods/eef6211e-89be-4693-83c8-9b077602da01/volumes" Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.481770 4749 scope.go:117] "RemoveContainer" containerID="7eca393b6ee4afdbf4866193fffcc844b6569bc01c8a62fc888fd70f3727aa06" Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.536600 4749 scope.go:117] "RemoveContainer" containerID="49b6b7ed109ea5fe37214b44cce0145d27b1102cadf4c143b3dcb6ef7c389331" Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.619853 4749 scope.go:117] "RemoveContainer" containerID="c5b19be98784276127790013d8bf9be97e8dc033c7284b03f529e958d5803923" Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.630007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6b84c17-5e5b-4464-9890-31bb49853d6d","Type":"ContainerStarted","Data":"c3dd4925a7b03f6d055f9d6deeaa730d74f4dc0d711324038bcb0d723e010372"} Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.674058 4749 scope.go:117] "RemoveContainer" containerID="fa364c66da2b0c10a56bba19b95062b2dd2c9e8c2e6e8596701cfe10d7a238d3" Nov 29 02:57:23 crc kubenswrapper[4749]: I1129 02:57:23.715776 4749 scope.go:117] "RemoveContainer" containerID="dceaec226c6160677deef66ecbe5eb22dadd413eea3a4f96ca160a2f2d415084" Nov 29 02:57:24 crc kubenswrapper[4749]: I1129 02:57:24.652462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6b84c17-5e5b-4464-9890-31bb49853d6d","Type":"ContainerStarted","Data":"3d38a2f01a3625c48eba430e377f9d423547df3f5d70f347deca2e878912b529"} Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.373543 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.373787 4749 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.373828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.374925 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.374993 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" gracePeriod=600 Nov 29 02:57:25 crc kubenswrapper[4749]: E1129 02:57:25.524290 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.666434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6b84c17-5e5b-4464-9890-31bb49853d6d","Type":"ContainerStarted","Data":"a13d5092fea57a83d8aa2e22dbc86ee3927d1f8ff259b28193259af1f55476b9"} Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.668829 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" exitCode=0 Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.668864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48"} Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.668898 4749 scope.go:117] "RemoveContainer" containerID="ab08658feaa017e23d82e44bd49750736188bccc502a12cbbe95310295445311" Nov 29 02:57:25 crc kubenswrapper[4749]: I1129 02:57:25.669549 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:57:25 crc kubenswrapper[4749]: E1129 02:57:25.669825 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:57:26 crc kubenswrapper[4749]: E1129 02:57:26.076075 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:27 crc kubenswrapper[4749]: I1129 02:57:27.703582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b6b84c17-5e5b-4464-9890-31bb49853d6d","Type":"ContainerStarted","Data":"b7c6f04853fdb60103f13493d02bb30985cf8f28451c8f2c9d73338e31c77f72"} Nov 29 02:57:27 crc kubenswrapper[4749]: I1129 02:57:27.704143 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 29 02:57:28 crc kubenswrapper[4749]: I1129 02:57:28.611741 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 29 02:57:28 crc kubenswrapper[4749]: I1129 02:57:28.650463 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.693353836 podStartE2EDuration="7.650441158s" podCreationTimestamp="2025-11-29 02:57:21 +0000 UTC" firstStartedPulling="2025-11-29 02:57:22.497753873 +0000 UTC m=+6385.669903730" lastFinishedPulling="2025-11-29 02:57:26.454841165 +0000 UTC m=+6389.626991052" observedRunningTime="2025-11-29 02:57:27.737841431 +0000 UTC m=+6390.909991308" watchObservedRunningTime="2025-11-29 02:57:28.650441158 +0000 UTC m=+6391.822591025" Nov 29 02:57:28 crc kubenswrapper[4749]: I1129 02:57:28.735239 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 29 02:57:28 crc kubenswrapper[4749]: I1129 02:57:28.816151 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 29 02:57:36 crc kubenswrapper[4749]: E1129 02:57:36.405124 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:37 crc kubenswrapper[4749]: I1129 02:57:37.088155 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:57:37 crc kubenswrapper[4749]: E1129 02:57:37.089089 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:57:46 crc kubenswrapper[4749]: E1129 02:57:46.719713 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.063194 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b1ac-account-create-update-rg4th"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.076704 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-k877z"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.087683 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b1ac-account-create-update-rg4th"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.097885 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-k877z"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.448689 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.453596 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.490402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.601684 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.601922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7jpr\" (UniqueName: \"kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.601980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.704509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7jpr\" (UniqueName: \"kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr\") pod \"redhat-marketplace-vdzpj\" (UID: 
\"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.704611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.704754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.705134 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.705162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.728934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7jpr\" (UniqueName: \"kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr\") pod \"redhat-marketplace-vdzpj\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:48 crc kubenswrapper[4749]: I1129 02:57:48.786365 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.089110 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e39d797-8cc7-40f1-8929-2ab733b4da0b" path="/var/lib/kubelet/pods/2e39d797-8cc7-40f1-8929-2ab733b4da0b/volumes" Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.090094 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f351a6-2f7a-48f1-9b19-1c73da851367" path="/var/lib/kubelet/pods/e9f351a6-2f7a-48f1-9b19-1c73da851367/volumes" Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.303484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.994475 4749 generic.go:334] "Generic (PLEG): container finished" podID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerID="fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09" exitCode=0 Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.994590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerDied","Data":"fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09"} Nov 29 02:57:49 crc kubenswrapper[4749]: I1129 02:57:49.994962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerStarted","Data":"0fdd1d7458df8ca7631c9d88a773546ff1efbca85748f5024d22e4151c7caf3d"} Nov 29 02:57:51 crc kubenswrapper[4749]: I1129 02:57:51.007682 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerStarted","Data":"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674"} Nov 29 02:57:51 crc kubenswrapper[4749]: I1129 02:57:51.076168 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:57:51 crc kubenswrapper[4749]: E1129 02:57:51.076448 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:57:52 crc kubenswrapper[4749]: I1129 02:57:52.005523 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 29 02:57:52 crc kubenswrapper[4749]: I1129 02:57:52.016260 4749 generic.go:334] "Generic (PLEG): container finished" podID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerID="95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674" exitCode=0 Nov 29 02:57:52 crc kubenswrapper[4749]: I1129 02:57:52.016294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerDied","Data":"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674"} Nov 29 02:57:53 crc kubenswrapper[4749]: I1129 02:57:53.031370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerStarted","Data":"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8"} Nov 29 02:57:53 crc kubenswrapper[4749]: I1129 02:57:53.056440 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdzpj" podStartSLOduration=2.58712134 podStartE2EDuration="5.056411215s" podCreationTimestamp="2025-11-29 02:57:48 +0000 UTC" firstStartedPulling="2025-11-29 02:57:49.99703925 +0000 UTC m=+6413.169189107" lastFinishedPulling="2025-11-29 02:57:52.466329085 +0000 UTC m=+6415.638478982" observedRunningTime="2025-11-29 02:57:53.050169164 +0000 UTC m=+6416.222319051" watchObservedRunningTime="2025-11-29 02:57:53.056411215 +0000 UTC m=+6416.228561132" Nov 29 02:57:56 crc kubenswrapper[4749]: I1129 02:57:56.043871 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-brvj2"] Nov 29 02:57:56 crc kubenswrapper[4749]: I1129 02:57:56.054808 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-brvj2"] Nov 29 02:57:56 crc kubenswrapper[4749]: E1129 02:57:56.988154 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-conmon-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9006334e_6cca_4531_95a8_6bc7b2fa7b4d.slice/crio-57979142461a757faa5ad6c49a70b339846afe00f7096f36ddb671b406abf6ef.scope\": RecentStats: unable to find data in memory cache]" Nov 29 02:57:57 crc kubenswrapper[4749]: I1129 02:57:57.112153 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c3f510-09bd-45e0-afe6-9f4c3753d311" path="/var/lib/kubelet/pods/d7c3f510-09bd-45e0-afe6-9f4c3753d311/volumes" Nov 29 02:57:58 crc kubenswrapper[4749]: I1129 02:57:58.786513 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:58 crc kubenswrapper[4749]: I1129 02:57:58.786872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:58 crc kubenswrapper[4749]: I1129 02:57:58.852064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:59 crc kubenswrapper[4749]: I1129 02:57:59.206701 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:57:59 crc kubenswrapper[4749]: I1129 02:57:59.269712 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.145622 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdzpj" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="registry-server" containerID="cri-o://93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8" gracePeriod=2 Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.720008 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.877085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities\") pod \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.877170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content\") pod \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.877241 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7jpr\" (UniqueName: \"kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr\") pod \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\" (UID: \"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f\") " Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.879810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities" (OuterVolumeSpecName: "utilities") pod "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" (UID: "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.891465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr" (OuterVolumeSpecName: "kube-api-access-k7jpr") pod "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" (UID: "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f"). InnerVolumeSpecName "kube-api-access-k7jpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.914119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" (UID: "0680c2e8-959d-4b13-9c3e-985c5c1a6b2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.980154 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.980215 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7jpr\" (UniqueName: \"kubernetes.io/projected/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-kube-api-access-k7jpr\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:01 crc kubenswrapper[4749]: I1129 02:58:01.980230 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.173326 4749 generic.go:334] "Generic (PLEG): container finished" podID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerID="93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8" exitCode=0 Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.173381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerDied","Data":"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8"} Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.173412 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdzpj" event={"ID":"0680c2e8-959d-4b13-9c3e-985c5c1a6b2f","Type":"ContainerDied","Data":"0fdd1d7458df8ca7631c9d88a773546ff1efbca85748f5024d22e4151c7caf3d"} Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.173422 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdzpj" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.173433 4749 scope.go:117] "RemoveContainer" containerID="93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.199442 4749 scope.go:117] "RemoveContainer" containerID="95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.227112 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.245750 4749 scope.go:117] "RemoveContainer" containerID="fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.308094 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdzpj"] Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.368382 4749 scope.go:117] "RemoveContainer" containerID="93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8" Nov 29 02:58:02 crc kubenswrapper[4749]: E1129 02:58:02.375337 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8\": container with ID starting with 93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8 not found: ID does not exist" containerID="93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.375393 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8"} err="failed to get container status \"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8\": rpc error: code = NotFound desc = could not find container \"93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8\": container with ID starting with 93e75d0159a8c6f001b22d04590b906c5bb469de6fb4c2d9c0e3b26003f9f4a8 not found: ID does not exist" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.375427 4749 scope.go:117] "RemoveContainer" containerID="95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674" Nov 29 02:58:02 crc kubenswrapper[4749]: E1129 02:58:02.376715 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674\": container with ID starting with 95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674 not found: ID does not exist" containerID="95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.376755 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674"} err="failed to get container status \"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674\": rpc error: code = NotFound desc = could not find container \"95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674\": container with ID starting with 95a988a7edfb3857cdffcea72315ffc50a5dbdf27b89428fcdef949b8c63d674 not found: ID does not exist" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.376779 4749 scope.go:117] "RemoveContainer" 
containerID="fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09" Nov 29 02:58:02 crc kubenswrapper[4749]: E1129 02:58:02.378592 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09\": container with ID starting with fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09 not found: ID does not exist" containerID="fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09" Nov 29 02:58:02 crc kubenswrapper[4749]: I1129 02:58:02.378611 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09"} err="failed to get container status \"fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09\": rpc error: code = NotFound desc = could not find container \"fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09\": container with ID starting with fd39087ba5c3dec7ec75453cff1e396f9ea4552850d9529bcdca3ba038789b09 not found: ID does not exist" Nov 29 02:58:03 crc kubenswrapper[4749]: I1129 02:58:03.092729 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" path="/var/lib/kubelet/pods/0680c2e8-959d-4b13-9c3e-985c5c1a6b2f/volumes" Nov 29 02:58:04 crc kubenswrapper[4749]: I1129 02:58:04.075672 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:58:04 crc kubenswrapper[4749]: E1129 02:58:04.076093 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.061004 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:14 crc kubenswrapper[4749]: E1129 02:58:14.062824 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="registry-server" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.062860 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="registry-server" Nov 29 02:58:14 crc kubenswrapper[4749]: E1129 02:58:14.062875 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="extract-content" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.062881 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="extract-content" Nov 29 02:58:14 crc kubenswrapper[4749]: E1129 02:58:14.062903 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="extract-utilities" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.062909 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="extract-utilities" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.063126 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0680c2e8-959d-4b13-9c3e-985c5c1a6b2f" containerName="registry-server" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.064311 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.066563 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.073394 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.119918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.120002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.120022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.120064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.120313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9x2\" (UniqueName: \"kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.120538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.222218 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.222389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lq9x2\" (UniqueName: \"kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.222499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.223095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.223408 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.223566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.224410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.224549 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.224573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.225174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.225390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.245801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9x2\" (UniqueName: \"kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2\") pod \"dnsmasq-dns-6cd6cd9bb7-frt4j\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:14 crc kubenswrapper[4749]: I1129 02:58:14.389876 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:15 crc kubenswrapper[4749]: I1129 02:58:15.108689 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:15 crc kubenswrapper[4749]: I1129 02:58:15.342782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" event={"ID":"d6808600-a967-4054-91c4-79728ec398d0","Type":"ContainerStarted","Data":"f291d3b373ac2e0158dceda007abaaf7e448408e36e30cc94a4819bb072cff56"} Nov 29 02:58:16 crc kubenswrapper[4749]: I1129 02:58:16.369771 4749 generic.go:334] "Generic (PLEG): container finished" podID="d6808600-a967-4054-91c4-79728ec398d0" containerID="3d2f384e6d9ff63b2b23bdb4c6a46307b17f836610fa147009fef1604c5e24c6" exitCode=0 Nov 29 02:58:16 crc kubenswrapper[4749]: I1129 02:58:16.370032 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" event={"ID":"d6808600-a967-4054-91c4-79728ec398d0","Type":"ContainerDied","Data":"3d2f384e6d9ff63b2b23bdb4c6a46307b17f836610fa147009fef1604c5e24c6"} Nov 29 02:58:17 crc kubenswrapper[4749]: I1129 02:58:17.409582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" event={"ID":"d6808600-a967-4054-91c4-79728ec398d0","Type":"ContainerStarted","Data":"e8ba4bc096ddbd89e2d93fe07f202c42a2b84d14354196131e16f7dd3b08b1f1"} Nov 29 02:58:17 crc kubenswrapper[4749]: I1129 02:58:17.413779 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:17 crc kubenswrapper[4749]: I1129 02:58:17.460185 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" podStartSLOduration=3.460164599 podStartE2EDuration="3.460164599s" podCreationTimestamp="2025-11-29 02:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:58:17.446757694 +0000 UTC m=+6440.618907551" watchObservedRunningTime="2025-11-29 02:58:17.460164599 +0000 UTC m=+6440.632314466" Nov 29 02:58:19 crc kubenswrapper[4749]: I1129 02:58:19.075759 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:58:19 crc kubenswrapper[4749]: E1129 02:58:19.076317 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:58:23 crc kubenswrapper[4749]: I1129 02:58:23.946549 
4749 scope.go:117] "RemoveContainer" containerID="f6b1f13821cb38acb978d353217b93cb261b1eb3f56ab50d72b169d81ee37190" Nov 29 02:58:23 crc kubenswrapper[4749]: I1129 02:58:23.982590 4749 scope.go:117] "RemoveContainer" containerID="d05157f45661b784be5a49101ed8c0aae65a5da610802f1aaf74b12a0c327ebf" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.039775 4749 scope.go:117] "RemoveContainer" containerID="d69572105badca04be2140e2b31841c6d87c36fb1b80f0e79f541dd20d59aa7c" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.392003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.548770 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.549147 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-546989675f-phc7t" podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="dnsmasq-dns" containerID="cri-o://1b756fe6e412e4b37d27dfca6f93ba2390f35d15ef4237393519ba499966ee0a" gracePeriod=10 Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.835134 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-697db9564f-g8dnz"] Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.836880 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.849423 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697db9564f-g8dnz"] Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892661 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slsbm\" (UniqueName: \"kubernetes.io/projected/6f7ce334-1fd4-4745-b003-8291d6592f93-kube-api-access-slsbm\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-dns-svc\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892752 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-config\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-nb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-sb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.892875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-openstack-cell1\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slsbm\" (UniqueName: \"kubernetes.io/projected/6f7ce334-1fd4-4745-b003-8291d6592f93-kube-api-access-slsbm\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994522 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-dns-svc\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-config\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-nb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-sb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.994717 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-openstack-cell1\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.995596 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-config\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.995678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-openstack-cell1\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.995984 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-nb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.996321 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-dns-svc\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:24 crc kubenswrapper[4749]: I1129 02:58:24.996703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7ce334-1fd4-4745-b003-8291d6592f93-ovsdbserver-sb\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.015620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slsbm\" (UniqueName: \"kubernetes.io/projected/6f7ce334-1fd4-4745-b003-8291d6592f93-kube-api-access-slsbm\") pod \"dnsmasq-dns-697db9564f-g8dnz\" (UID: \"6f7ce334-1fd4-4745-b003-8291d6592f93\") " pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.196808 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.523732 4749 generic.go:334] "Generic (PLEG): container finished" podID="c131e997-2d41-4b8b-b10b-07d028170709" containerID="1b756fe6e412e4b37d27dfca6f93ba2390f35d15ef4237393519ba499966ee0a" exitCode=0 Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.523816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546989675f-phc7t" event={"ID":"c131e997-2d41-4b8b-b10b-07d028170709","Type":"ContainerDied","Data":"1b756fe6e412e4b37d27dfca6f93ba2390f35d15ef4237393519ba499966ee0a"} Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.699585 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.720049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config\") pod \"c131e997-2d41-4b8b-b10b-07d028170709\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.720114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb\") pod \"c131e997-2d41-4b8b-b10b-07d028170709\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.720164 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb\") pod \"c131e997-2d41-4b8b-b10b-07d028170709\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.720478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z88q\" (UniqueName: \"kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q\") pod \"c131e997-2d41-4b8b-b10b-07d028170709\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.720570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc\") pod \"c131e997-2d41-4b8b-b10b-07d028170709\" (UID: \"c131e997-2d41-4b8b-b10b-07d028170709\") " Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.735858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q" (OuterVolumeSpecName: "kube-api-access-9z88q") pod "c131e997-2d41-4b8b-b10b-07d028170709" (UID: "c131e997-2d41-4b8b-b10b-07d028170709"). InnerVolumeSpecName "kube-api-access-9z88q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.782096 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697db9564f-g8dnz"] Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.786693 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config" (OuterVolumeSpecName: "config") pod "c131e997-2d41-4b8b-b10b-07d028170709" (UID: "c131e997-2d41-4b8b-b10b-07d028170709"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.801945 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c131e997-2d41-4b8b-b10b-07d028170709" (UID: "c131e997-2d41-4b8b-b10b-07d028170709"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.821025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c131e997-2d41-4b8b-b10b-07d028170709" (UID: "c131e997-2d41-4b8b-b10b-07d028170709"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.824191 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.824238 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.824254 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.824267 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z88q\" (UniqueName: \"kubernetes.io/projected/c131e997-2d41-4b8b-b10b-07d028170709-kube-api-access-9z88q\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.826935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c131e997-2d41-4b8b-b10b-07d028170709" (UID: "c131e997-2d41-4b8b-b10b-07d028170709"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:25 crc kubenswrapper[4749]: I1129 02:58:25.926120 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c131e997-2d41-4b8b-b10b-07d028170709-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.538225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546989675f-phc7t" event={"ID":"c131e997-2d41-4b8b-b10b-07d028170709","Type":"ContainerDied","Data":"67250c1cd7f41e5622f3fa5cd544525017153346af742440fb47f330a6dea4d5"} Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.538489 4749 scope.go:117] "RemoveContainer" containerID="1b756fe6e412e4b37d27dfca6f93ba2390f35d15ef4237393519ba499966ee0a" Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.538250 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546989675f-phc7t" Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.540504 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f7ce334-1fd4-4745-b003-8291d6592f93" containerID="114ebba42b94814939095bfd5e9e01bc0e83d34c7f25bbefef1c26114b6ed450" exitCode=0 Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.540543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" event={"ID":"6f7ce334-1fd4-4745-b003-8291d6592f93","Type":"ContainerDied","Data":"114ebba42b94814939095bfd5e9e01bc0e83d34c7f25bbefef1c26114b6ed450"} Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.540575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" event={"ID":"6f7ce334-1fd4-4745-b003-8291d6592f93","Type":"ContainerStarted","Data":"80388dc8f1aa83b7e00dd29784bdb0d63c1e1b2255aba3da760f0078669df107"} Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.574876 4749 scope.go:117] "RemoveContainer" containerID="7c26699b16d299fc549fc3bc48472a7dd5faa53930cc7c653970f2c2869f89e5" Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.827012 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:58:26 crc kubenswrapper[4749]: I1129 02:58:26.840780 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546989675f-phc7t"] Nov 29 02:58:27 crc kubenswrapper[4749]: I1129 02:58:27.097531 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c131e997-2d41-4b8b-b10b-07d028170709" path="/var/lib/kubelet/pods/c131e997-2d41-4b8b-b10b-07d028170709/volumes" Nov 29 02:58:27 crc kubenswrapper[4749]: I1129 02:58:27.551476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" event={"ID":"6f7ce334-1fd4-4745-b003-8291d6592f93","Type":"ContainerStarted","Data":"215f8ad367fb9cf889bf6555cbfce5ee5eee43c9a9c9e191ab5a45d49956078f"} Nov 29 02:58:27 crc kubenswrapper[4749]: I1129 02:58:27.551568 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:27 crc kubenswrapper[4749]: I1129 02:58:27.576951 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" podStartSLOduration=3.576932184 podStartE2EDuration="3.576932184s" podCreationTimestamp="2025-11-29 02:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 02:58:27.567447804 +0000 UTC m=+6450.739597711" watchObservedRunningTime="2025-11-29 02:58:27.576932184 +0000 UTC m=+6450.749082041" Nov 29 02:58:31 crc kubenswrapper[4749]: I1129 02:58:31.075252 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:58:31 crc kubenswrapper[4749]: E1129 02:58:31.090489 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.199166 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-697db9564f-g8dnz" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.283582 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.283850 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="dnsmasq-dns" containerID="cri-o://e8ba4bc096ddbd89e2d93fe07f202c42a2b84d14354196131e16f7dd3b08b1f1" gracePeriod=10 Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.662237 4749 generic.go:334] "Generic (PLEG): container finished" podID="d6808600-a967-4054-91c4-79728ec398d0" containerID="e8ba4bc096ddbd89e2d93fe07f202c42a2b84d14354196131e16f7dd3b08b1f1" exitCode=0 Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.662319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" event={"ID":"d6808600-a967-4054-91c4-79728ec398d0","Type":"ContainerDied","Data":"e8ba4bc096ddbd89e2d93fe07f202c42a2b84d14354196131e16f7dd3b08b1f1"} Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.835558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853598 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9x2\" (UniqueName: \"kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.853891 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb\") pod \"d6808600-a967-4054-91c4-79728ec398d0\" (UID: \"d6808600-a967-4054-91c4-79728ec398d0\") " Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.868753 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2" (OuterVolumeSpecName: "kube-api-access-lq9x2") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "kube-api-access-lq9x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.917017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.931133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.936507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.939908 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.943425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config" (OuterVolumeSpecName: "config") pod "d6808600-a967-4054-91c4-79728ec398d0" (UID: "d6808600-a967-4054-91c4-79728ec398d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955729 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955759 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955773 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-config\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955786 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9x2\" (UniqueName: \"kubernetes.io/projected/d6808600-a967-4054-91c4-79728ec398d0-kube-api-access-lq9x2\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955797 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:35 crc kubenswrapper[4749]: I1129 02:58:35.955808 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6808600-a967-4054-91c4-79728ec398d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.681493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" event={"ID":"d6808600-a967-4054-91c4-79728ec398d0","Type":"ContainerDied","Data":"f291d3b373ac2e0158dceda007abaaf7e448408e36e30cc94a4819bb072cff56"} Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.681891 4749 scope.go:117] "RemoveContainer" containerID="e8ba4bc096ddbd89e2d93fe07f202c42a2b84d14354196131e16f7dd3b08b1f1" Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.681616 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd6cd9bb7-frt4j" Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.714862 4749 scope.go:117] "RemoveContainer" containerID="3d2f384e6d9ff63b2b23bdb4c6a46307b17f836610fa147009fef1604c5e24c6" Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.742404 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:36 crc kubenswrapper[4749]: I1129 02:58:36.753867 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd6cd9bb7-frt4j"] Nov 29 02:58:37 crc kubenswrapper[4749]: I1129 02:58:37.093928 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6808600-a967-4054-91c4-79728ec398d0" path="/var/lib/kubelet/pods/d6808600-a967-4054-91c4-79728ec398d0/volumes" Nov 29 02:58:43 crc kubenswrapper[4749]: I1129 02:58:43.076317 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:58:43 crc kubenswrapper[4749]: E1129 02:58:43.077722 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.307295 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs"] Nov 29 02:58:46 crc kubenswrapper[4749]: E1129 02:58:46.308381 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308399 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: E1129 02:58:46.308426 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308433 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: E1129 02:58:46.308457 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="init" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308465 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="init" Nov 29 02:58:46 crc kubenswrapper[4749]: E1129 02:58:46.308485 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="init" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308493 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="init" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308744 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6808600-a967-4054-91c4-79728ec398d0" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.308779 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c131e997-2d41-4b8b-b10b-07d028170709" containerName="dnsmasq-dns" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.309758 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.313963 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.316995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.317629 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.331463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.332137 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs"] Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.446349 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.446492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.446524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.446607 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svf2k\" (UniqueName: \"kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.446633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.548795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.549025 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.549103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.549329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svf2k\" (UniqueName: \"kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.549871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.556053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.556572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.557180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph\") 
pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.560745 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.575574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svf2k\" (UniqueName: \"kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:46 crc kubenswrapper[4749]: I1129 02:58:46.662548 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:58:47 crc kubenswrapper[4749]: I1129 02:58:47.293318 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs"] Nov 29 02:58:47 crc kubenswrapper[4749]: W1129 02:58:47.301767 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dfc44dd_ef6d_44d9_9bcf_e1a28cdb71d6.slice/crio-4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4 WatchSource:0}: Error finding container 4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4: Status 404 returned error can't find the container with id 4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4 Nov 29 02:58:47 crc kubenswrapper[4749]: I1129 02:58:47.831257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" event={"ID":"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6","Type":"ContainerStarted","Data":"4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4"} Nov 29 02:58:56 crc kubenswrapper[4749]: I1129 02:58:56.075855 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:58:56 crc kubenswrapper[4749]: E1129 02:58:56.077320 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:58:58 crc kubenswrapper[4749]: I1129 02:58:58.657483 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 02:58:59 crc kubenswrapper[4749]: I1129 02:58:59.988778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" 
event={"ID":"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6","Type":"ContainerStarted","Data":"b36e8f54231018594ac2763ddd71113af57c0b7269128840cbfb49ae6b19d535"} Nov 29 02:59:00 crc kubenswrapper[4749]: I1129 02:59:00.013242 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" podStartSLOduration=2.6642899570000003 podStartE2EDuration="14.013221063s" podCreationTimestamp="2025-11-29 02:58:46 +0000 UTC" firstStartedPulling="2025-11-29 02:58:47.304417015 +0000 UTC m=+6470.476566872" lastFinishedPulling="2025-11-29 02:58:58.653348121 +0000 UTC m=+6481.825497978" observedRunningTime="2025-11-29 02:59:00.007610157 +0000 UTC m=+6483.179760024" watchObservedRunningTime="2025-11-29 02:59:00.013221063 +0000 UTC m=+6483.185370930" Nov 29 02:59:08 crc kubenswrapper[4749]: I1129 02:59:08.075989 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:59:08 crc kubenswrapper[4749]: E1129 02:59:08.077172 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:59:13 crc kubenswrapper[4749]: I1129 02:59:13.141490 4749 generic.go:334] "Generic (PLEG): container finished" podID="1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" containerID="b36e8f54231018594ac2763ddd71113af57c0b7269128840cbfb49ae6b19d535" exitCode=0 Nov 29 02:59:13 crc kubenswrapper[4749]: I1129 02:59:13.141590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" event={"ID":"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6","Type":"ContainerDied","Data":"b36e8f54231018594ac2763ddd71113af57c0b7269128840cbfb49ae6b19d535"} Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.665838 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.830247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svf2k\" (UniqueName: \"kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k\") pod \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.830619 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle\") pod \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.830784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph\") pod \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.830911 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory\") pod \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.831319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key\") pod \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\" (UID: \"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6\") " Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.835888 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph" (OuterVolumeSpecName: "ceph") pod "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" (UID: "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.835924 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" (UID: "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.851078 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k" (OuterVolumeSpecName: "kube-api-access-svf2k") pod "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" (UID: "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6"). InnerVolumeSpecName "kube-api-access-svf2k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.858107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory" (OuterVolumeSpecName: "inventory") pod "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" (UID: "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.876359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" (UID: "1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.934895 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.935437 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.935453 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.935465 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svf2k\" (UniqueName: \"kubernetes.io/projected/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-kube-api-access-svf2k\") on node \"crc\" DevicePath \"\"" Nov 29 02:59:14 crc kubenswrapper[4749]: I1129 02:59:14.935478 4749 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 02:59:15 crc kubenswrapper[4749]: I1129 02:59:15.176687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" event={"ID":"1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6","Type":"ContainerDied","Data":"4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4"} Nov 29 02:59:15 crc kubenswrapper[4749]: I1129 02:59:15.176758 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a765f0a1a313ad3a6fbe8f2467c179e30b70417de004b8e0cdcc3505db765e4" Nov 29 02:59:15 crc kubenswrapper[4749]: I1129 02:59:15.176808 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.484120 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5"] Nov 29 02:59:19 crc kubenswrapper[4749]: E1129 02:59:19.494483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.494516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.495177 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.496469 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.501067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.501285 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.501337 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.501882 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.529329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5"] Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.663695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.663756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.664472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w55np\" (UniqueName: \"kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.664517 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.664728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.766411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.766450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w55np\" (UniqueName: \"kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.766498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.766565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.766621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.772566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.772706 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.772773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.773650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.790929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w55np\" (UniqueName: \"kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:19 crc kubenswrapper[4749]: I1129 02:59:19.822173 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 02:59:20 crc kubenswrapper[4749]: I1129 02:59:20.390656 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5"] Nov 29 02:59:21 crc kubenswrapper[4749]: I1129 02:59:21.252959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" event={"ID":"234ff02d-f844-46c2-9a13-9cc6ce370926","Type":"ContainerStarted","Data":"9226f96005dd431a7c029c8cba0a5bf4497abdc4939941c9e1d9b05ec0473a8e"} Nov 29 02:59:21 crc kubenswrapper[4749]: I1129 02:59:21.253704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" event={"ID":"234ff02d-f844-46c2-9a13-9cc6ce370926","Type":"ContainerStarted","Data":"7c5154a466864c2fa1a1679e07c9361036ad73985bacc26b0d16335bb1207339"} Nov 29 02:59:21 crc kubenswrapper[4749]: I1129 02:59:21.294439 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" podStartSLOduration=1.837821658 podStartE2EDuration="2.294420569s" podCreationTimestamp="2025-11-29 02:59:19 +0000 UTC" firstStartedPulling="2025-11-29 02:59:20.400988856 +0000 UTC m=+6503.573138713" lastFinishedPulling="2025-11-29 02:59:20.857587767 +0000 UTC m=+6504.029737624" observedRunningTime="2025-11-29 02:59:21.283870392 +0000 UTC m=+6504.456020289" watchObservedRunningTime="2025-11-29 02:59:21.294420569 +0000 UTC m=+6504.466570446" Nov 29 02:59:22 crc kubenswrapper[4749]: I1129 02:59:22.076299 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:59:22 crc kubenswrapper[4749]: E1129 02:59:22.076633 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:59:24 crc kubenswrapper[4749]: I1129 02:59:24.204528 4749 scope.go:117] "RemoveContainer" containerID="82d8499978b254cd5f0aaa57908aff4712cd2f06f50f07053572e0821e3648b4" Nov 29 02:59:24 crc kubenswrapper[4749]: I1129 02:59:24.226504 4749 scope.go:117] "RemoveContainer" containerID="b4853470ee8eed8df2843e4e7701044f08b50f51f1f8c8bab08d4bca8708176f" Nov 29 02:59:33 crc kubenswrapper[4749]: I1129 02:59:33.075548 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:59:33 crc kubenswrapper[4749]: E1129 02:59:33.076549 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 02:59:47 crc kubenswrapper[4749]: I1129 02:59:47.102501 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 02:59:47 crc kubenswrapper[4749]: E1129 
02:59:47.103928 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.160345 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9"] Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.163566 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.166776 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.167032 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.176412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9"] Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.212631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.213154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxc2\" (UniqueName: \"kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.213289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.315838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxc2\" (UniqueName: \"kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.315899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.315955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.316868 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.325765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.339972 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxc2\" (UniqueName: \"kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2\") pod \"collect-profiles-29406420-hzsm9\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:00 crc kubenswrapper[4749]: I1129 03:00:00.523846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:01 crc kubenswrapper[4749]: I1129 03:00:01.035213 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9"] Nov 29 03:00:01 crc kubenswrapper[4749]: I1129 03:00:01.699080 4749 generic.go:334] "Generic (PLEG): container finished" podID="2178e763-77a7-4221-a7e4-db0afbfe42c8" containerID="6d1ebeaf0b5202f7c74a8f46100c2a419b905acd371abbc810cf17114c8e4600" exitCode=0 Nov 29 03:00:01 crc kubenswrapper[4749]: I1129 03:00:01.699161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" event={"ID":"2178e763-77a7-4221-a7e4-db0afbfe42c8","Type":"ContainerDied","Data":"6d1ebeaf0b5202f7c74a8f46100c2a419b905acd371abbc810cf17114c8e4600"} Nov 29 03:00:01 crc kubenswrapper[4749]: I1129 03:00:01.699687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" event={"ID":"2178e763-77a7-4221-a7e4-db0afbfe42c8","Type":"ContainerStarted","Data":"268a8bfa6236daed7b4bdffd5f038a550a739e86c142d971d961ba3867c0a0dc"} Nov 29 03:00:02 crc kubenswrapper[4749]: I1129 03:00:02.076151 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:00:02 crc kubenswrapper[4749]: E1129 03:00:02.076642 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.148390 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.178899 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cxc2\" (UniqueName: \"kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2\") pod \"2178e763-77a7-4221-a7e4-db0afbfe42c8\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.179188 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume\") pod \"2178e763-77a7-4221-a7e4-db0afbfe42c8\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.179291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume\") pod \"2178e763-77a7-4221-a7e4-db0afbfe42c8\" (UID: \"2178e763-77a7-4221-a7e4-db0afbfe42c8\") " Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.179812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "2178e763-77a7-4221-a7e4-db0afbfe42c8" (UID: "2178e763-77a7-4221-a7e4-db0afbfe42c8"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.180120 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2178e763-77a7-4221-a7e4-db0afbfe42c8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.188460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2178e763-77a7-4221-a7e4-db0afbfe42c8" (UID: "2178e763-77a7-4221-a7e4-db0afbfe42c8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.188504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2" (OuterVolumeSpecName: "kube-api-access-6cxc2") pod "2178e763-77a7-4221-a7e4-db0afbfe42c8" (UID: "2178e763-77a7-4221-a7e4-db0afbfe42c8"). InnerVolumeSpecName "kube-api-access-6cxc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.282378 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2178e763-77a7-4221-a7e4-db0afbfe42c8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.282410 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cxc2\" (UniqueName: \"kubernetes.io/projected/2178e763-77a7-4221-a7e4-db0afbfe42c8-kube-api-access-6cxc2\") on node \"crc\" DevicePath \"\"" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.722287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" event={"ID":"2178e763-77a7-4221-a7e4-db0afbfe42c8","Type":"ContainerDied","Data":"268a8bfa6236daed7b4bdffd5f038a550a739e86c142d971d961ba3867c0a0dc"} Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.722636 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="268a8bfa6236daed7b4bdffd5f038a550a739e86c142d971d961ba3867c0a0dc" Nov 29 03:00:03 crc kubenswrapper[4749]: I1129 03:00:03.722371 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9" Nov 29 03:00:04 crc kubenswrapper[4749]: I1129 03:00:04.227281 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc"] Nov 29 03:00:04 crc kubenswrapper[4749]: I1129 03:00:04.236009 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406375-8f5rc"] Nov 29 03:00:05 crc kubenswrapper[4749]: I1129 03:00:05.091005 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7973bd-913f-4441-8e08-9d5ab221b673" path="/var/lib/kubelet/pods/2b7973bd-913f-4441-8e08-9d5ab221b673/volumes" Nov 29 03:00:13 crc kubenswrapper[4749]: I1129 03:00:13.075271 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:00:13 crc kubenswrapper[4749]: E1129 03:00:13.076046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:24 crc kubenswrapper[4749]: I1129 03:00:24.490767 4749 scope.go:117] "RemoveContainer" containerID="2f633c8cfac15f8676af00b34b24107a3153defc94ea298f9aeb3d288bb0a27c" Nov 29 03:00:28 crc kubenswrapper[4749]: I1129 03:00:28.075310 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:00:28 crc kubenswrapper[4749]: E1129 03:00:28.075763 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:39 crc kubenswrapper[4749]: I1129 03:00:39.070008 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-rr527"] Nov 29 03:00:39 crc kubenswrapper[4749]: I1129 03:00:39.100028 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-rr527"] Nov 29 03:00:40 crc kubenswrapper[4749]: I1129 03:00:40.076159 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:00:40 crc kubenswrapper[4749]: E1129 03:00:40.076916 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:41 crc kubenswrapper[4749]: I1129 03:00:41.035525 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-cb81-account-create-update-mvmq8"] Nov 29 03:00:41 crc kubenswrapper[4749]: I1129 03:00:41.045426 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/octavia-cb81-account-create-update-mvmq8"] Nov 29 03:00:41 crc kubenswrapper[4749]: I1129 03:00:41.092645 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ddb009-1193-414f-9d3d-2e43e6eef806" path="/var/lib/kubelet/pods/04ddb009-1193-414f-9d3d-2e43e6eef806/volumes" Nov 29 03:00:41 crc kubenswrapper[4749]: I1129 03:00:41.094039 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf870051-768d-41f5-93bb-0f977eb2ae14" path="/var/lib/kubelet/pods/bf870051-768d-41f5-93bb-0f977eb2ae14/volumes" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.022734 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-tpwtc"] Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.031448 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-tpwtc"] Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.398107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:00:46 crc kubenswrapper[4749]: E1129 03:00:46.402755 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2178e763-77a7-4221-a7e4-db0afbfe42c8" containerName="collect-profiles" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.402792 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2178e763-77a7-4221-a7e4-db0afbfe42c8" containerName="collect-profiles" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.403016 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2178e763-77a7-4221-a7e4-db0afbfe42c8" containerName="collect-profiles" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.404629 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.412440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.562650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.562700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.562754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mgln\" (UniqueName: \"kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.666409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.666520 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.666597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mgln\" (UniqueName: \"kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.667542 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.667562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.695940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9mgln\" (UniqueName: \"kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln\") pod \"redhat-operators-nzjgc\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:46 crc kubenswrapper[4749]: I1129 03:00:46.743066 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.033685 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-53ad-account-create-update-s9jwg"] Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.045216 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-53ad-account-create-update-s9jwg"] Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.091521 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb8a55c-d7cf-4875-9d9e-7760ca068b8d" path="/var/lib/kubelet/pods/6eb8a55c-d7cf-4875-9d9e-7760ca068b8d/volumes" Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.092727 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb95d3b-fab4-4a91-936f-98bfc00dc782" path="/var/lib/kubelet/pods/ddb95d3b-fab4-4a91-936f-98bfc00dc782/volumes" Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.245618 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:00:47 crc kubenswrapper[4749]: I1129 03:00:47.296764 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerStarted","Data":"9e8d04b8357e4496988858c1ada71235154c4d0ead04e065be1a005659dc2a45"} Nov 29 03:00:48 crc kubenswrapper[4749]: I1129 03:00:48.311192 4749 generic.go:334] "Generic (PLEG): container finished" podID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerID="61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f" exitCode=0 Nov 29 03:00:48 crc kubenswrapper[4749]: I1129 03:00:48.311533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerDied","Data":"61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f"} Nov 29 03:00:50 crc kubenswrapper[4749]: I1129 03:00:50.330755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerStarted","Data":"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384"} Nov 29 03:00:52 crc kubenswrapper[4749]: I1129 03:00:52.075349 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:00:52 crc kubenswrapper[4749]: E1129 03:00:52.076322 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:00:54 crc kubenswrapper[4749]: I1129 03:00:54.377825 4749 generic.go:334] "Generic (PLEG): container finished" podID="f138b353-d1d5-4706-af92-e1ad0db0196d" 
containerID="cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384" exitCode=0 Nov 29 03:00:54 crc kubenswrapper[4749]: I1129 03:00:54.377999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerDied","Data":"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384"} Nov 29 03:00:56 crc kubenswrapper[4749]: I1129 03:00:56.406661 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerStarted","Data":"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4"} Nov 29 03:00:56 crc kubenswrapper[4749]: I1129 03:00:56.435607 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzjgc" podStartSLOduration=3.482336703 podStartE2EDuration="10.435590066s" podCreationTimestamp="2025-11-29 03:00:46 +0000 UTC" firstStartedPulling="2025-11-29 03:00:48.313916478 +0000 UTC m=+6591.486066375" lastFinishedPulling="2025-11-29 03:00:55.267169881 +0000 UTC m=+6598.439319738" observedRunningTime="2025-11-29 03:00:56.434757226 +0000 UTC m=+6599.606907083" watchObservedRunningTime="2025-11-29 03:00:56.435590066 +0000 UTC m=+6599.607739923" Nov 29 03:00:56 crc kubenswrapper[4749]: I1129 03:00:56.743590 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:56 crc kubenswrapper[4749]: I1129 03:00:56.743655 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:00:57 crc kubenswrapper[4749]: I1129 03:00:57.842075 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzjgc" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server" probeResult="failure" output=< Nov 29 03:00:57 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 03:00:57 crc kubenswrapper[4749]: > Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.168272 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29406421-hn8jz"] Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.171156 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.179738 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406421-hn8jz"] Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.337589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.337791 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5bp9\" (UniqueName: \"kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.337846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.338036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.440374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.440544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.440824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.440975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5bp9\" (UniqueName: \"kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.450571 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.450593 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.453123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.472645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5bp9\" (UniqueName: \"kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9\") pod \"keystone-cron-29406421-hn8jz\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:00 crc kubenswrapper[4749]: I1129 03:01:00.552910 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:01 crc kubenswrapper[4749]: W1129 03:01:01.055429 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9347885d_8de5_4420_a979_c96f8e80d931.slice/crio-9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a WatchSource:0}: Error finding container 9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a: Status 404 returned error can't find the container with id 9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a Nov 29 03:01:01 crc kubenswrapper[4749]: I1129 03:01:01.055607 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406421-hn8jz"] Nov 29 03:01:01 crc kubenswrapper[4749]: I1129 03:01:01.480153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406421-hn8jz" event={"ID":"9347885d-8de5-4420-a979-c96f8e80d931","Type":"ContainerStarted","Data":"b7fae707add7ffd78279e7fc8236045795a23176c9164c8b407815d212d64294"} Nov 29 03:01:01 crc kubenswrapper[4749]: I1129 03:01:01.480666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406421-hn8jz" event={"ID":"9347885d-8de5-4420-a979-c96f8e80d931","Type":"ContainerStarted","Data":"9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a"} Nov 29 03:01:01 crc kubenswrapper[4749]: I1129 03:01:01.505855 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29406421-hn8jz" podStartSLOduration=1.505837001 podStartE2EDuration="1.505837001s" podCreationTimestamp="2025-11-29 03:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:01:01.502761856 +0000 UTC m=+6604.674911723" watchObservedRunningTime="2025-11-29 03:01:01.505837001 +0000 UTC m=+6604.677986878" Nov 29 03:01:03 crc kubenswrapper[4749]: I1129 03:01:03.075672 4749 
scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:01:03 crc kubenswrapper[4749]: E1129 03:01:03.076590 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:01:03 crc kubenswrapper[4749]: I1129 03:01:03.511237 4749 generic.go:334] "Generic (PLEG): container finished" podID="9347885d-8de5-4420-a979-c96f8e80d931" containerID="b7fae707add7ffd78279e7fc8236045795a23176c9164c8b407815d212d64294" exitCode=0 Nov 29 03:01:03 crc kubenswrapper[4749]: I1129 03:01:03.511347 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406421-hn8jz" event={"ID":"9347885d-8de5-4420-a979-c96f8e80d931","Type":"ContainerDied","Data":"b7fae707add7ffd78279e7fc8236045795a23176c9164c8b407815d212d64294"} Nov 29 03:01:04 crc kubenswrapper[4749]: I1129 03:01:04.910823 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.047913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys\") pod \"9347885d-8de5-4420-a979-c96f8e80d931\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.048070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data\") pod \"9347885d-8de5-4420-a979-c96f8e80d931\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.048174 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5bp9\" (UniqueName: \"kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9\") pod \"9347885d-8de5-4420-a979-c96f8e80d931\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.048334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle\") pod \"9347885d-8de5-4420-a979-c96f8e80d931\" (UID: \"9347885d-8de5-4420-a979-c96f8e80d931\") " Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.053729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9347885d-8de5-4420-a979-c96f8e80d931" (UID: "9347885d-8de5-4420-a979-c96f8e80d931"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.055448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9" (OuterVolumeSpecName: "kube-api-access-r5bp9") pod "9347885d-8de5-4420-a979-c96f8e80d931" (UID: "9347885d-8de5-4420-a979-c96f8e80d931"). InnerVolumeSpecName "kube-api-access-r5bp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.097922 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9347885d-8de5-4420-a979-c96f8e80d931" (UID: "9347885d-8de5-4420-a979-c96f8e80d931"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.130184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data" (OuterVolumeSpecName: "config-data") pod "9347885d-8de5-4420-a979-c96f8e80d931" (UID: "9347885d-8de5-4420-a979-c96f8e80d931"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.151482 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5bp9\" (UniqueName: \"kubernetes.io/projected/9347885d-8de5-4420-a979-c96f8e80d931-kube-api-access-r5bp9\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.151752 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.151875 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.152016 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9347885d-8de5-4420-a979-c96f8e80d931-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.538393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406421-hn8jz" event={"ID":"9347885d-8de5-4420-a979-c96f8e80d931","Type":"ContainerDied","Data":"9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a"} Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.538733 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9607c5d9b92cccf8e9b01dd9cef12c8bf01cdec2ae3469c456964049847bd91a" Nov 29 03:01:05 crc kubenswrapper[4749]: I1129 03:01:05.538451 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406421-hn8jz" Nov 29 03:01:07 crc kubenswrapper[4749]: I1129 03:01:07.877738 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzjgc" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server" probeResult="failure" output=< Nov 29 03:01:07 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 03:01:07 crc kubenswrapper[4749]: > Nov 29 03:01:16 crc kubenswrapper[4749]: I1129 03:01:16.830925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:01:16 crc kubenswrapper[4749]: I1129 03:01:16.925422 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:01:17 crc kubenswrapper[4749]: I1129 03:01:17.590848 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:01:18 crc kubenswrapper[4749]: I1129 03:01:18.076315 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:01:18 crc kubenswrapper[4749]: E1129 03:01:18.076814 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:01:18 crc kubenswrapper[4749]: I1129 03:01:18.704417 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzjgc" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server" containerID="cri-o://ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4" gracePeriod=2 Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.304031 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.426155 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content\") pod \"f138b353-d1d5-4706-af92-e1ad0db0196d\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.426302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities\") pod \"f138b353-d1d5-4706-af92-e1ad0db0196d\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.426433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mgln\" (UniqueName: \"kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln\") pod \"f138b353-d1d5-4706-af92-e1ad0db0196d\" (UID: \"f138b353-d1d5-4706-af92-e1ad0db0196d\") " Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.427124 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities" (OuterVolumeSpecName: "utilities") pod "f138b353-d1d5-4706-af92-e1ad0db0196d" (UID: "f138b353-d1d5-4706-af92-e1ad0db0196d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.431984 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln" (OuterVolumeSpecName: "kube-api-access-9mgln") pod "f138b353-d1d5-4706-af92-e1ad0db0196d" (UID: "f138b353-d1d5-4706-af92-e1ad0db0196d"). InnerVolumeSpecName "kube-api-access-9mgln". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.528500 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.528534 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mgln\" (UniqueName: \"kubernetes.io/projected/f138b353-d1d5-4706-af92-e1ad0db0196d-kube-api-access-9mgln\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.530410 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f138b353-d1d5-4706-af92-e1ad0db0196d" (UID: "f138b353-d1d5-4706-af92-e1ad0db0196d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.630866 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f138b353-d1d5-4706-af92-e1ad0db0196d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.718042 4749 generic.go:334] "Generic (PLEG): container finished" podID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerID="ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4" exitCode=0 Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.718084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerDied","Data":"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4"} Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.718148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzjgc" event={"ID":"f138b353-d1d5-4706-af92-e1ad0db0196d","Type":"ContainerDied","Data":"9e8d04b8357e4496988858c1ada71235154c4d0ead04e065be1a005659dc2a45"} Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.718166 4749 scope.go:117] "RemoveContainer" containerID="ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.718101 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzjgc" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.745180 4749 scope.go:117] "RemoveContainer" containerID="cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.786218 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.790071 4749 scope.go:117] "RemoveContainer" containerID="61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.798818 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzjgc"] Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.839238 4749 scope.go:117] "RemoveContainer" containerID="ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4" Nov 29 03:01:19 crc kubenswrapper[4749]: E1129 03:01:19.839754 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4\": container with ID starting with ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4 not found: ID does not exist" containerID="ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.839794 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4"} err="failed to get container status \"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4\": rpc error: code = NotFound desc = could not find container \"ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4\": container with ID starting with ef40b5228c2dddad6a0b1c76935216124dc6afbafee5a85e746e3a48f8134ce4 not found: ID does not exist" Nov 29 03:01:19 crc 
kubenswrapper[4749]: I1129 03:01:19.839820 4749 scope.go:117] "RemoveContainer" containerID="cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384" Nov 29 03:01:19 crc kubenswrapper[4749]: E1129 03:01:19.840324 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384\": container with ID starting with cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384 not found: ID does not exist" containerID="cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.840356 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384"} err="failed to get container status \"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384\": rpc error: code = NotFound desc = could not find container \"cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384\": container with ID starting with cdb11082169e4cae6836c3b705003c756100b0c06040187f3b44e30cab6c8384 not found: ID does not exist" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.840377 4749 scope.go:117] "RemoveContainer" containerID="61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f" Nov 29 03:01:19 crc kubenswrapper[4749]: E1129 03:01:19.840807 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f\": container with ID starting with 61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f not found: ID does not exist" containerID="61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f" Nov 29 03:01:19 crc kubenswrapper[4749]: I1129 03:01:19.840834 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f"} err="failed to get container status \"61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f\": rpc error: code = NotFound desc = could not find container \"61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f\": container with ID starting with 61f026e178dc4b076d98bf224df93f921d5570abbca244e901b3b73217f1df3f not found: ID does not exist" Nov 29 03:01:21 crc kubenswrapper[4749]: I1129 03:01:21.092983 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" path="/var/lib/kubelet/pods/f138b353-d1d5-4706-af92-e1ad0db0196d/volumes" Nov 29 03:01:23 crc kubenswrapper[4749]: I1129 03:01:23.132284 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-zpzjj"] Nov 29 03:01:23 crc kubenswrapper[4749]: I1129 03:01:23.144318 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-zpzjj"] Nov 29 03:01:24 crc kubenswrapper[4749]: I1129 03:01:24.559387 4749 scope.go:117] "RemoveContainer" containerID="7e66cacb1b10ff2c01335394b929b95d45b30bbbe57583c1207b7f987d3f79eb" Nov 29 03:01:24 crc kubenswrapper[4749]: I1129 03:01:24.604672 4749 scope.go:117] "RemoveContainer" containerID="bc91c4659f52834f20f498c7df37cd086edb1cb966ea6bd1d722b8843c472a18" Nov 29 03:01:24 crc kubenswrapper[4749]: I1129 03:01:24.673832 4749 scope.go:117] "RemoveContainer" containerID="9f3bd2523993ca220cd187bec20bbccb45f18e29e61d18375d06104e85698abc" 
Nov 29 03:01:24 crc kubenswrapper[4749]: I1129 03:01:24.716578 4749 scope.go:117] "RemoveContainer" containerID="7d6c57c39853bb11cf1146ecb503929fbae62b7194c08e1e512055fc5f72f227"
Nov 29 03:01:25 crc kubenswrapper[4749]: I1129 03:01:25.092439 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f45513c9-263f-4a5b-9cba-e3412e0ac46d" path="/var/lib/kubelet/pods/f45513c9-263f-4a5b-9cba-e3412e0ac46d/volumes"
Nov 29 03:01:32 crc kubenswrapper[4749]: I1129 03:01:32.075339 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48"
Nov 29 03:01:32 crc kubenswrapper[4749]: E1129 03:01:32.076690 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:01:47 crc kubenswrapper[4749]: I1129 03:01:47.095074 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48"
Nov 29 03:01:47 crc kubenswrapper[4749]: E1129 03:01:47.110494 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.385715 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvp7x"]
Nov 29 03:01:56 crc kubenswrapper[4749]: E1129 03:01:56.387438 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="extract-utilities"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.387477 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="extract-utilities"
Nov 29 03:01:56 crc kubenswrapper[4749]: E1129 03:01:56.387499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9347885d-8de5-4420-a979-c96f8e80d931" containerName="keystone-cron"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.387515 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9347885d-8de5-4420-a979-c96f8e80d931" containerName="keystone-cron"
Nov 29 03:01:56 crc kubenswrapper[4749]: E1129 03:01:56.387541 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.387557 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server"
Nov 29 03:01:56 crc kubenswrapper[4749]: E1129 03:01:56.387605 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="extract-content"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.387623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="extract-content"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.388251 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f138b353-d1d5-4706-af92-e1ad0db0196d" containerName="registry-server"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.388367 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9347885d-8de5-4420-a979-c96f8e80d931" containerName="keystone-cron"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.392238 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.401996 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvp7x"]
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.494249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.494344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.494470 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwzhf\" (UniqueName: \"kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.596451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwzhf\" (UniqueName: \"kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.596601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.596657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.597131 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x"
pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.597235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.628770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwzhf\" (UniqueName: \"kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf\") pod \"community-operators-vvp7x\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:01:56 crc kubenswrapper[4749]: I1129 03:01:56.738450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:01:57 crc kubenswrapper[4749]: I1129 03:01:57.218087 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvp7x"] Nov 29 03:01:57 crc kubenswrapper[4749]: W1129 03:01:57.221651 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb29571c3_bfb9_419a_a2a4_b86896f26ae2.slice/crio-b8258ab7db5eaeb05d5fa2ce679c76e05130e7f2fd05b025d2af38c8ff532415 WatchSource:0}: Error finding container b8258ab7db5eaeb05d5fa2ce679c76e05130e7f2fd05b025d2af38c8ff532415: Status 404 returned error can't find the container with id b8258ab7db5eaeb05d5fa2ce679c76e05130e7f2fd05b025d2af38c8ff532415 Nov 29 03:01:58 crc kubenswrapper[4749]: I1129 03:01:58.228520 4749 generic.go:334] "Generic (PLEG): container finished" podID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerID="efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162" exitCode=0 Nov 29 03:01:58 crc kubenswrapper[4749]: I1129 03:01:58.228608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerDied","Data":"efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162"} Nov 29 03:01:58 crc kubenswrapper[4749]: I1129 03:01:58.228861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerStarted","Data":"b8258ab7db5eaeb05d5fa2ce679c76e05130e7f2fd05b025d2af38c8ff532415"} Nov 29 03:01:58 crc kubenswrapper[4749]: I1129 03:01:58.231303 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:01:59 crc kubenswrapper[4749]: I1129 03:01:59.239523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerStarted","Data":"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95"} Nov 29 03:02:00 crc kubenswrapper[4749]: I1129 03:02:00.074775 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:02:00 crc kubenswrapper[4749]: E1129 03:02:00.075537 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:02:00 crc kubenswrapper[4749]: I1129 03:02:00.253960 4749 generic.go:334] "Generic (PLEG): container finished" podID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerID="dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95" exitCode=0 Nov 29 03:02:00 crc kubenswrapper[4749]: I1129 03:02:00.254011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerDied","Data":"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95"} Nov 29 03:02:02 crc kubenswrapper[4749]: I1129 03:02:02.279650 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerStarted","Data":"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea"} Nov 29 03:02:02 crc kubenswrapper[4749]: I1129 03:02:02.315880 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvp7x" podStartSLOduration=3.114774013 podStartE2EDuration="6.315853417s" podCreationTimestamp="2025-11-29 03:01:56 +0000 UTC" firstStartedPulling="2025-11-29 03:01:58.230926503 +0000 UTC m=+6661.403076370" lastFinishedPulling="2025-11-29 03:02:01.432005877 +0000 UTC m=+6664.604155774" observedRunningTime="2025-11-29 03:02:02.303540448 +0000 UTC m=+6665.475690355" watchObservedRunningTime="2025-11-29 03:02:02.315853417 +0000 UTC m=+6665.488003314" Nov 29 03:02:06 crc kubenswrapper[4749]: I1129 03:02:06.739943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:06 crc kubenswrapper[4749]: I1129 03:02:06.740474 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:06 crc kubenswrapper[4749]: I1129 03:02:06.824819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:07 crc kubenswrapper[4749]: I1129 03:02:07.423982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:07 crc kubenswrapper[4749]: I1129 03:02:07.490314 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvp7x"] Nov 29 03:02:09 crc kubenswrapper[4749]: I1129 03:02:09.363399 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vvp7x" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="registry-server" containerID="cri-o://fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea" gracePeriod=2 Nov 29 03:02:09 crc kubenswrapper[4749]: I1129 03:02:09.904908 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.050803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities\") pod \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.051176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwzhf\" (UniqueName: \"kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf\") pod \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.051631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content\") pod \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\" (UID: \"b29571c3-bfb9-419a-a2a4-b86896f26ae2\") " Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.051895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities" (OuterVolumeSpecName: "utilities") pod "b29571c3-bfb9-419a-a2a4-b86896f26ae2" (UID: "b29571c3-bfb9-419a-a2a4-b86896f26ae2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.053444 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.061739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf" (OuterVolumeSpecName: "kube-api-access-xwzhf") pod "b29571c3-bfb9-419a-a2a4-b86896f26ae2" (UID: "b29571c3-bfb9-419a-a2a4-b86896f26ae2"). InnerVolumeSpecName "kube-api-access-xwzhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.118302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b29571c3-bfb9-419a-a2a4-b86896f26ae2" (UID: "b29571c3-bfb9-419a-a2a4-b86896f26ae2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.155893 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwzhf\" (UniqueName: \"kubernetes.io/projected/b29571c3-bfb9-419a-a2a4-b86896f26ae2-kube-api-access-xwzhf\") on node \"crc\" DevicePath \"\"" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.155953 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b29571c3-bfb9-419a-a2a4-b86896f26ae2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.373589 4749 generic.go:334] "Generic (PLEG): container finished" podID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerID="fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea" exitCode=0 Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.373632 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerDied","Data":"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea"} Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.373661 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvp7x" event={"ID":"b29571c3-bfb9-419a-a2a4-b86896f26ae2","Type":"ContainerDied","Data":"b8258ab7db5eaeb05d5fa2ce679c76e05130e7f2fd05b025d2af38c8ff532415"} Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.373679 4749 scope.go:117] "RemoveContainer" containerID="fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.373685 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vvp7x" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.403221 4749 scope.go:117] "RemoveContainer" containerID="dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.428086 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvp7x"] Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.438233 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vvp7x"] Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.448324 4749 scope.go:117] "RemoveContainer" containerID="efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.490637 4749 scope.go:117] "RemoveContainer" containerID="fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea" Nov 29 03:02:10 crc kubenswrapper[4749]: E1129 03:02:10.491111 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea\": container with ID starting with fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea not found: ID does not exist" containerID="fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.491142 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea"} err="failed to get container status \"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea\": rpc error: code = NotFound desc = could not find container \"fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea\": container with ID starting with fa73eb7e29b9f64b99377900b089242ce3dcc2363e84e67bd793972d676760ea not found: ID does not exist" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.491169 4749 scope.go:117] "RemoveContainer" containerID="dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95" Nov 29 03:02:10 crc kubenswrapper[4749]: E1129 03:02:10.491638 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95\": container with ID starting with dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95 not found: ID does not exist" containerID="dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.491661 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95"} err="failed to get container status \"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95\": rpc error: code = NotFound desc = could not find container \"dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95\": container with ID starting with dfa7eeb1b355e1c9734b00c7cfbdf814250a57f6685f6836e8992b59c1fc2e95 not found: ID does not exist" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.491675 4749 scope.go:117] "RemoveContainer" containerID="efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162" Nov 29 03:02:10 crc kubenswrapper[4749]: E1129 03:02:10.492136 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162\": container with ID starting with efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162 not found: ID does not exist" containerID="efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162" Nov 29 03:02:10 crc kubenswrapper[4749]: I1129 03:02:10.492180 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162"} err="failed to get container status \"efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162\": rpc error: code = NotFound desc = could not find container \"efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162\": container with ID starting with efce884f88a48044e607afdd41be7b805061f33d5838f334ade8dedc2f38e162 not found: ID does not exist" Nov 29 03:02:11 crc kubenswrapper[4749]: I1129 03:02:11.075297 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:02:11 crc kubenswrapper[4749]: E1129 03:02:11.075760 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:02:11 crc kubenswrapper[4749]: I1129 03:02:11.085806 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" path="/var/lib/kubelet/pods/b29571c3-bfb9-419a-a2a4-b86896f26ae2/volumes" Nov 29 03:02:24 crc kubenswrapper[4749]: I1129 03:02:24.932902 4749 scope.go:117] "RemoveContainer" containerID="19f1bb570219e384c0b086ef6f7d9eb806a02574c2ccc627e647b48a6a809ad9" Nov 29 03:02:24 crc kubenswrapper[4749]: I1129 03:02:24.967143 4749 scope.go:117] "RemoveContainer" containerID="62c902af63a7eed153f838a027c15bd5e050f638857a916cc35c50b37d9c7f60" Nov 29 03:02:25 crc kubenswrapper[4749]: I1129 03:02:25.011434 4749 scope.go:117] "RemoveContainer" containerID="23fb169c73ef17503ee8c859a534f7051825dc16f46dac76f6d21c0f650f1620" Nov 29 03:02:25 crc kubenswrapper[4749]: I1129 03:02:25.130819 4749 scope.go:117] "RemoveContainer" containerID="f3f8f05a44984ecd1d427d13ca004f02a81b0ad058a17e4468ecba4ac8d429bf" Nov 29 03:02:25 crc kubenswrapper[4749]: I1129 03:02:25.154879 4749 scope.go:117] "RemoveContainer" containerID="fbc2e75d6ab9a5f7f585c4986ee37b0078c1050d45562a709893ad49ddf577a0" Nov 29 03:02:25 crc kubenswrapper[4749]: I1129 03:02:25.188671 4749 scope.go:117] "RemoveContainer" containerID="ba99f5ee9386ea3c4abb3a01a964b876a74380b9523ffa0286363aa8e094612a" Nov 29 03:02:26 crc kubenswrapper[4749]: I1129 03:02:26.076022 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:02:26 crc kubenswrapper[4749]: I1129 03:02:26.582744 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9"} Nov 29 03:04:09 crc kubenswrapper[4749]: I1129 03:04:09.067966 4749 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/heat-5b01-account-create-update-ljxkt"] Nov 29 03:04:09 crc kubenswrapper[4749]: I1129 03:04:09.101746 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-cxgwg"] Nov 29 03:04:09 crc kubenswrapper[4749]: I1129 03:04:09.105141 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5b01-account-create-update-ljxkt"] Nov 29 03:04:09 crc kubenswrapper[4749]: I1129 03:04:09.117491 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-cxgwg"] Nov 29 03:04:11 crc kubenswrapper[4749]: I1129 03:04:11.092509 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff1338c-165a-47af-ad4e-275d7b90dd87" path="/var/lib/kubelet/pods/3ff1338c-165a-47af-ad4e-275d7b90dd87/volumes" Nov 29 03:04:11 crc kubenswrapper[4749]: I1129 03:04:11.093709 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476687b6-9c04-4736-ad82-900e844ea6be" path="/var/lib/kubelet/pods/476687b6-9c04-4736-ad82-900e844ea6be/volumes" Nov 29 03:04:24 crc kubenswrapper[4749]: I1129 03:04:24.035447 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-lvfcj"] Nov 29 03:04:24 crc kubenswrapper[4749]: I1129 03:04:24.044049 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-lvfcj"] Nov 29 03:04:25 crc kubenswrapper[4749]: I1129 03:04:25.101365 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304202b2-d372-4ab5-95f7-b77b94748b1a" path="/var/lib/kubelet/pods/304202b2-d372-4ab5-95f7-b77b94748b1a/volumes" Nov 29 03:04:25 crc kubenswrapper[4749]: I1129 03:04:25.331067 4749 scope.go:117] "RemoveContainer" containerID="1a88eedfacc04edc798be76f032f37259b34203383e686ac112a0ca43b037bbe" Nov 29 03:04:25 crc kubenswrapper[4749]: I1129 03:04:25.391980 4749 scope.go:117] "RemoveContainer" containerID="486e18a6d2bae3cbb3c65ff282a069114dde06569453785301ea7f164ce9c773" Nov 29 03:04:25 crc kubenswrapper[4749]: I1129 03:04:25.433633 4749 scope.go:117] "RemoveContainer" containerID="b11465d1097dbefe618db5f6c2af03a9a67ed514c1c471dc3adeb7aedd44714a" Nov 29 03:04:55 crc kubenswrapper[4749]: I1129 03:04:55.374650 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:04:55 crc kubenswrapper[4749]: I1129 03:04:55.375333 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:05:25 crc kubenswrapper[4749]: I1129 03:05:25.374602 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:05:25 crc kubenswrapper[4749]: I1129 03:05:25.375522 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:05:55 crc kubenswrapper[4749]: I1129 03:05:55.374179 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:05:55 crc kubenswrapper[4749]: I1129 03:05:55.375014 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:05:55 crc kubenswrapper[4749]: I1129 03:05:55.375083 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:05:55 crc kubenswrapper[4749]: I1129 03:05:55.376357 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:05:55 crc kubenswrapper[4749]: I1129 03:05:55.376462 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9" gracePeriod=600 Nov 29 03:05:56 crc kubenswrapper[4749]: I1129 03:05:56.187071 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9" exitCode=0 Nov 29 03:05:56 crc kubenswrapper[4749]: I1129 03:05:56.187121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9"} Nov 29 03:05:56 crc kubenswrapper[4749]: I1129 03:05:56.188162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"} Nov 29 03:05:56 crc kubenswrapper[4749]: I1129 03:05:56.188240 4749 scope.go:117] "RemoveContainer" containerID="3c54e9bf8c3a20625e46654b7d64805fd7d1fae8cd6d2131b9b3a384a0578d48" Nov 29 03:06:21 crc kubenswrapper[4749]: I1129 03:06:21.069379 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-nkzq6"] Nov 29 03:06:21 crc kubenswrapper[4749]: I1129 03:06:21.105375 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-32db-account-create-update-k6bvn"] Nov 29 03:06:21 crc kubenswrapper[4749]: I1129 03:06:21.111752 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-nkzq6"] Nov 29 03:06:21 crc kubenswrapper[4749]: I1129 03:06:21.133006 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-32db-account-create-update-k6bvn"] Nov 29 03:06:23 crc kubenswrapper[4749]: I1129 03:06:23.090248 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2831df11-ac96-49f4-9669-9c549e61a190" path="/var/lib/kubelet/pods/2831df11-ac96-49f4-9669-9c549e61a190/volumes" Nov 29 03:06:23 crc kubenswrapper[4749]: I1129 03:06:23.091138 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1e0df9-b788-4227-a867-61420e8494f5" path="/var/lib/kubelet/pods/7e1e0df9-b788-4227-a867-61420e8494f5/volumes" Nov 29 03:06:25 crc kubenswrapper[4749]: I1129 03:06:25.574410 4749 scope.go:117] "RemoveContainer" containerID="77d01ee553bfe049ee0f3c19ea92b4455d0c514e96980994730f84588ede205d" Nov 29 03:06:25 crc kubenswrapper[4749]: I1129 03:06:25.635880 4749 scope.go:117] "RemoveContainer" containerID="a1fa6836123b024905c592bf3fe11a5ca39ccbbb57fff5711a6c40c2e802b1f4" Nov 29 03:06:32 crc kubenswrapper[4749]: I1129 03:06:32.047170 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8ccv8"] Nov 29 03:06:32 crc kubenswrapper[4749]: I1129 03:06:32.066425 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8ccv8"] Nov 29 03:06:33 crc kubenswrapper[4749]: I1129 03:06:33.096424 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881ef581-f476-4e44-b01f-797f4fa23d1f" path="/var/lib/kubelet/pods/881ef581-f476-4e44-b01f-797f4fa23d1f/volumes" Nov 29 03:06:56 crc kubenswrapper[4749]: I1129 03:06:56.040754 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-4c2zm"] Nov 29 03:06:56 crc kubenswrapper[4749]: I1129 03:06:56.061245 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-0243-account-create-update-r6mf6"] Nov 29 03:06:56 crc kubenswrapper[4749]: I1129 03:06:56.081403 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-4c2zm"] Nov 29 03:06:56 crc kubenswrapper[4749]: I1129 03:06:56.091307 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-0243-account-create-update-r6mf6"] Nov 29 03:06:57 crc kubenswrapper[4749]: I1129 03:06:57.097815 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2799ee87-ddc8-4e43-abbb-26744d666d22" path="/var/lib/kubelet/pods/2799ee87-ddc8-4e43-abbb-26744d666d22/volumes" Nov 29 03:06:57 crc kubenswrapper[4749]: I1129 03:06:57.098939 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f86295e-c261-49f4-ae57-0c975f06c73b" path="/var/lib/kubelet/pods/7f86295e-c261-49f4-ae57-0c975f06c73b/volumes" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.152139 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:04 crc kubenswrapper[4749]: E1129 03:07:04.153562 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="extract-utilities" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.153582 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="extract-utilities" Nov 29 03:07:04 crc kubenswrapper[4749]: E1129 03:07:04.153600 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="extract-content" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.153609 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="extract-content" Nov 29 03:07:04 crc kubenswrapper[4749]: E1129 03:07:04.153641 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="registry-server" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.153649 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="registry-server" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.153978 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29571c3-bfb9-419a-a2a4-b86896f26ae2" containerName="registry-server" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.157597 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.195756 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.291175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.291241 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtfh\" (UniqueName: \"kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.291276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.392883 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.392937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtfh\" (UniqueName: \"kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.392971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: 
I1129 03:07:04.393454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.395089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.413860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtfh\" (UniqueName: \"kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh\") pod \"certified-operators-zhdlz\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:04 crc kubenswrapper[4749]: I1129 03:07:04.495408 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:05 crc kubenswrapper[4749]: I1129 03:07:05.008099 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:05 crc kubenswrapper[4749]: I1129 03:07:05.061166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerStarted","Data":"57703b03b7b3f2c43b404b48118ea0a942b9ed0a4a8af475126a55359b8e7d9e"} Nov 29 03:07:06 crc kubenswrapper[4749]: I1129 03:07:06.054124 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-5ljqf"] Nov 29 03:07:06 crc kubenswrapper[4749]: I1129 03:07:06.075676 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-5ljqf"] Nov 29 03:07:06 crc kubenswrapper[4749]: I1129 03:07:06.082382 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerID="886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1" exitCode=0 Nov 29 03:07:06 crc kubenswrapper[4749]: I1129 03:07:06.082422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerDied","Data":"886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1"} Nov 29 03:07:06 crc kubenswrapper[4749]: I1129 03:07:06.084796 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:07:07 crc kubenswrapper[4749]: I1129 03:07:07.105507 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92025c44-2545-4dc1-82f8-822ce5da38d6" path="/var/lib/kubelet/pods/92025c44-2545-4dc1-82f8-822ce5da38d6/volumes" Nov 29 03:07:07 crc kubenswrapper[4749]: I1129 03:07:07.108267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerStarted","Data":"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e"} Nov 29 03:07:09 crc kubenswrapper[4749]: I1129 03:07:09.141290 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerDied","Data":"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e"} Nov 29 03:07:09 crc kubenswrapper[4749]: I1129 03:07:09.141149 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerID="a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e" exitCode=0 Nov 29 03:07:10 crc kubenswrapper[4749]: I1129 03:07:10.160279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerStarted","Data":"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40"} Nov 29 03:07:10 crc kubenswrapper[4749]: I1129 03:07:10.196567 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zhdlz" podStartSLOduration=2.624992502 podStartE2EDuration="6.196542207s" podCreationTimestamp="2025-11-29 03:07:04 +0000 UTC" firstStartedPulling="2025-11-29 03:07:06.084611118 +0000 UTC m=+6969.256760975" lastFinishedPulling="2025-11-29 03:07:09.656160823 +0000 UTC m=+6972.828310680" observedRunningTime="2025-11-29 03:07:10.187040866 +0000 UTC m=+6973.359190783" watchObservedRunningTime="2025-11-29 03:07:10.196542207 +0000 UTC m=+6973.368692094" Nov 29 03:07:14 crc kubenswrapper[4749]: I1129 03:07:14.496157 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:14 crc kubenswrapper[4749]: I1129 03:07:14.498752 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:14 crc kubenswrapper[4749]: I1129 03:07:14.559719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:15 crc kubenswrapper[4749]: I1129 03:07:15.374795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:15 crc kubenswrapper[4749]: I1129 03:07:15.464184 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:17 crc kubenswrapper[4749]: I1129 03:07:17.310021 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zhdlz" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="registry-server" containerID="cri-o://949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40" gracePeriod=2 Nov 29 03:07:17 crc kubenswrapper[4749]: I1129 03:07:17.904496 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.049431 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content\") pod \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.049572 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities\") pod \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.049688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtfh\" (UniqueName: \"kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh\") pod \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\" (UID: \"4a20c353-7406-4213-a7b5-3959ab2c7f8d\") " Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.050436 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities" (OuterVolumeSpecName: "utilities") pod "4a20c353-7406-4213-a7b5-3959ab2c7f8d" (UID: "4a20c353-7406-4213-a7b5-3959ab2c7f8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.050967 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.064360 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh" (OuterVolumeSpecName: "kube-api-access-cbtfh") pod "4a20c353-7406-4213-a7b5-3959ab2c7f8d" (UID: "4a20c353-7406-4213-a7b5-3959ab2c7f8d"). InnerVolumeSpecName "kube-api-access-cbtfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.101769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a20c353-7406-4213-a7b5-3959ab2c7f8d" (UID: "4a20c353-7406-4213-a7b5-3959ab2c7f8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.152948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtfh\" (UniqueName: \"kubernetes.io/projected/4a20c353-7406-4213-a7b5-3959ab2c7f8d-kube-api-access-cbtfh\") on node \"crc\" DevicePath \"\"" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.152982 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a20c353-7406-4213-a7b5-3959ab2c7f8d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.320708 4749 generic.go:334] "Generic (PLEG): container finished" podID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerID="949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40" exitCode=0 Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.320771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerDied","Data":"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40"} Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.320813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhdlz" event={"ID":"4a20c353-7406-4213-a7b5-3959ab2c7f8d","Type":"ContainerDied","Data":"57703b03b7b3f2c43b404b48118ea0a942b9ed0a4a8af475126a55359b8e7d9e"} Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.320841 4749 scope.go:117] "RemoveContainer" containerID="949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.320777 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhdlz" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.354462 4749 scope.go:117] "RemoveContainer" containerID="a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.366669 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.379273 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zhdlz"] Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.390681 4749 scope.go:117] "RemoveContainer" containerID="886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.459712 4749 scope.go:117] "RemoveContainer" containerID="949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40" Nov 29 03:07:18 crc kubenswrapper[4749]: E1129 03:07:18.460316 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40\": container with ID starting with 949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40 not found: ID does not exist" containerID="949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.460369 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40"} err="failed to get container status \"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40\": rpc error: code = NotFound desc = could not find container \"949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40\": container with ID starting with 949d4b09af56d29fa1c74a5c007731c99d2d2560a0ad6b1793993cf207063e40 not found: ID does not exist" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.460407 4749 scope.go:117] "RemoveContainer" containerID="a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e" Nov 29 03:07:18 crc kubenswrapper[4749]: E1129 03:07:18.460758 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e\": container with ID starting with a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e not found: ID does not exist" containerID="a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.460781 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e"} err="failed to get container status \"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e\": rpc error: code = NotFound desc = could not find container \"a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e\": container with ID starting with a319e78431ea613f1ae8b0c80831b02084d6072133d4b6a619dac9b28a2d171e not found: ID does not exist" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.460794 4749 scope.go:117] "RemoveContainer" containerID="886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1" Nov 29 03:07:18 crc kubenswrapper[4749]: E1129 03:07:18.461030 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1\": container with ID starting with 886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1 not found: ID does not exist" containerID="886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1" Nov 29 03:07:18 crc kubenswrapper[4749]: I1129 03:07:18.461044 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1"} err="failed to get container status \"886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1\": rpc error: code = NotFound desc = could not find container \"886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1\": container with ID starting with 886d121e38f32cd283f14a39400abb304ca984efd55d6b1531bb8565420c03b1 not found: ID does not exist" Nov 29 03:07:19 crc kubenswrapper[4749]: I1129 03:07:19.100761 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" path="/var/lib/kubelet/pods/4a20c353-7406-4213-a7b5-3959ab2c7f8d/volumes" Nov 29 03:07:25 crc kubenswrapper[4749]: I1129 03:07:25.742772 4749 scope.go:117] "RemoveContainer" containerID="8df11921057fe81797b26a62a65f836e299eaf6d2bf6468123ef8d1956082de2" Nov 29 03:07:25 crc kubenswrapper[4749]: I1129 03:07:25.807078 4749 scope.go:117] "RemoveContainer" containerID="de2258c4a488bd1526cd672041c9cd0cf7e3fefbc5abd1dcd36f07790100992b" Nov 29 03:07:25 crc kubenswrapper[4749]: I1129 03:07:25.854219 4749 scope.go:117] "RemoveContainer" containerID="7b70211361ce7ecc3c19913935fbb34ebe80deb787d90dfe63df6fa5d8b02181" Nov 29 03:07:25 crc kubenswrapper[4749]: I1129 03:07:25.930380 4749 scope.go:117] "RemoveContainer" containerID="ff6291ff7f4fdeec2d4f0802ca167a7fb809ae0288ebf57afcf4936a8e1823af" Nov 29 03:07:55 crc kubenswrapper[4749]: I1129 03:07:55.374116 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:07:55 crc kubenswrapper[4749]: I1129 03:07:55.376054 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:08:25 crc kubenswrapper[4749]: I1129 03:08:25.374740 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:08:25 crc kubenswrapper[4749]: I1129 03:08:25.375595 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.146391 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:35 crc kubenswrapper[4749]: E1129 03:08:35.147667 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="extract-content" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.147688 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="extract-content" Nov 29 03:08:35 crc kubenswrapper[4749]: E1129 03:08:35.147735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="registry-server" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.147746 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="registry-server" Nov 29 03:08:35 crc kubenswrapper[4749]: E1129 03:08:35.147794 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="extract-utilities" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.147805 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="extract-utilities" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.148147 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a20c353-7406-4213-a7b5-3959ab2c7f8d" containerName="registry-server" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.150968 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.158672 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.262800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.262871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.262923 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565rs\" (UniqueName: \"kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.365488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.365554 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-565rs\" (UniqueName: \"kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.365877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.366084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.366449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.397098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565rs\" (UniqueName: \"kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs\") pod \"redhat-marketplace-xkkhq\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:35 crc kubenswrapper[4749]: I1129 03:08:35.491817 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:36 crc kubenswrapper[4749]: I1129 03:08:36.005824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:36 crc kubenswrapper[4749]: I1129 03:08:36.359432 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerID="bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0" exitCode=0 Nov 29 03:08:36 crc kubenswrapper[4749]: I1129 03:08:36.359513 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerDied","Data":"bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0"} Nov 29 03:08:36 crc kubenswrapper[4749]: I1129 03:08:36.359841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerStarted","Data":"98443ca311a18abf380fc8a0833c1f19b004937a1eb39049180420bd3b7c3752"} Nov 29 03:08:36 crc kubenswrapper[4749]: E1129 03:08:36.440873 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc5f84c0_1b33_4f2d_9fec_ea975af012c3.slice/crio-conmon-bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc5f84c0_1b33_4f2d_9fec_ea975af012c3.slice/crio-bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0.scope\": RecentStats: unable to find data in memory cache]" Nov 29 03:08:38 crc kubenswrapper[4749]: I1129 03:08:38.383986 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerID="3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50" exitCode=0 Nov 29 03:08:38 crc kubenswrapper[4749]: I1129 03:08:38.384089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerDied","Data":"3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50"} Nov 29 03:08:39 crc kubenswrapper[4749]: I1129 03:08:39.396933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerStarted","Data":"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded"} Nov 29 03:08:39 crc kubenswrapper[4749]: I1129 03:08:39.420983 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xkkhq" podStartSLOduration=1.7912805170000001 podStartE2EDuration="4.420958264s" podCreationTimestamp="2025-11-29 03:08:35 +0000 UTC" firstStartedPulling="2025-11-29 03:08:36.364296155 +0000 UTC m=+7059.536446032" lastFinishedPulling="2025-11-29 03:08:38.993973912 +0000 UTC m=+7062.166123779" observedRunningTime="2025-11-29 03:08:39.416452245 +0000 UTC m=+7062.588602152" watchObservedRunningTime="2025-11-29 03:08:39.420958264 +0000 UTC m=+7062.593108161" Nov 29 03:08:45 crc kubenswrapper[4749]: I1129 03:08:45.492192 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:45 crc 
kubenswrapper[4749]: I1129 03:08:45.492939 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:45 crc kubenswrapper[4749]: I1129 03:08:45.547657 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:46 crc kubenswrapper[4749]: I1129 03:08:46.532504 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:46 crc kubenswrapper[4749]: I1129 03:08:46.583036 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:48 crc kubenswrapper[4749]: I1129 03:08:48.502044 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xkkhq" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="registry-server" containerID="cri-o://eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded" gracePeriod=2 Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.087795 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.189569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities\") pod \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.189877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content\") pod \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.190039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565rs\" (UniqueName: \"kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs\") pod \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\" (UID: \"fc5f84c0-1b33-4f2d-9fec-ea975af012c3\") " Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.190510 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities" (OuterVolumeSpecName: "utilities") pod "fc5f84c0-1b33-4f2d-9fec-ea975af012c3" (UID: "fc5f84c0-1b33-4f2d-9fec-ea975af012c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.190775 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.195067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs" (OuterVolumeSpecName: "kube-api-access-565rs") pod "fc5f84c0-1b33-4f2d-9fec-ea975af012c3" (UID: "fc5f84c0-1b33-4f2d-9fec-ea975af012c3"). InnerVolumeSpecName "kube-api-access-565rs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.212567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc5f84c0-1b33-4f2d-9fec-ea975af012c3" (UID: "fc5f84c0-1b33-4f2d-9fec-ea975af012c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.295949 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565rs\" (UniqueName: \"kubernetes.io/projected/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-kube-api-access-565rs\") on node \"crc\" DevicePath \"\"" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.295987 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc5f84c0-1b33-4f2d-9fec-ea975af012c3-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.529456 4749 generic.go:334] "Generic (PLEG): container finished" podID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerID="eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded" exitCode=0 Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.529503 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerDied","Data":"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded"} Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.529533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xkkhq" event={"ID":"fc5f84c0-1b33-4f2d-9fec-ea975af012c3","Type":"ContainerDied","Data":"98443ca311a18abf380fc8a0833c1f19b004937a1eb39049180420bd3b7c3752"} Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.529553 4749 scope.go:117] "RemoveContainer" containerID="eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.529685 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xkkhq" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.572747 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.574022 4749 scope.go:117] "RemoveContainer" containerID="3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.583657 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xkkhq"] Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.601420 4749 scope.go:117] "RemoveContainer" containerID="bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.661840 4749 scope.go:117] "RemoveContainer" containerID="eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded" Nov 29 03:08:49 crc kubenswrapper[4749]: E1129 03:08:49.665431 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded\": container with ID starting with eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded not found: ID does not exist" containerID="eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.665483 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded"} err="failed to get container status \"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded\": rpc error: code = NotFound desc = could not find container \"eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded\": container with ID starting with eb9fc7955acac351e81fac2b6e2345b7e4eb3728751c74405a887419294aaded not found: ID does not exist" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.665517 4749 scope.go:117] "RemoveContainer" containerID="3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50" Nov 29 03:08:49 crc kubenswrapper[4749]: E1129 03:08:49.666063 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50\": container with ID starting with 3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50 not found: ID does not exist" containerID="3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.666176 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50"} err="failed to get container status \"3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50\": rpc error: code = NotFound desc = could not find container \"3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50\": container with ID starting with 3b788d4bf9da90ec4bc4cfeb217dde02dbda4e2bba0f9e0380dddb15855b6a50 not found: ID does not exist" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.666304 4749 scope.go:117] "RemoveContainer" containerID="bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0" Nov 29 03:08:49 crc kubenswrapper[4749]: E1129 03:08:49.666868 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0\": container with ID starting with bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0 not found: ID does not exist" containerID="bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0" Nov 29 03:08:49 crc kubenswrapper[4749]: I1129 03:08:49.666902 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0"} err="failed to get container status \"bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0\": rpc error: code = NotFound desc = could not find container \"bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0\": container with ID starting with bef3e2bd53dd8b661f239563fa6f38c2f1e60b2a1db95e6cdafd57c7ecdb8bb0 not found: ID does not exist" Nov 29 03:08:51 crc kubenswrapper[4749]: I1129 03:08:51.096213 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" path="/var/lib/kubelet/pods/fc5f84c0-1b33-4f2d-9fec-ea975af012c3/volumes" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.374294 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.374856 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.374903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.375767 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.375829 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" gracePeriod=600 Nov 29 03:08:55 crc kubenswrapper[4749]: E1129 03:08:55.509412 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.599049 4749 generic.go:334] 
"Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" exitCode=0 Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.599098 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"} Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.599133 4749 scope.go:117] "RemoveContainer" containerID="10af378cf1649d26aedc5bb5143c48567ce072ddd252d8dd3b04b9d0236f27e9" Nov 29 03:08:55 crc kubenswrapper[4749]: I1129 03:08:55.599954 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:08:55 crc kubenswrapper[4749]: E1129 03:08:55.600420 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:09:07 crc kubenswrapper[4749]: I1129 03:09:07.093840 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:09:07 crc kubenswrapper[4749]: E1129 03:09:07.095295 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:09:22 crc kubenswrapper[4749]: I1129 03:09:22.075948 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:09:22 crc kubenswrapper[4749]: E1129 03:09:22.077342 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:09:35 crc kubenswrapper[4749]: I1129 03:09:35.075649 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:09:35 crc kubenswrapper[4749]: E1129 03:09:35.076524 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:09:39 crc kubenswrapper[4749]: I1129 03:09:39.148592 4749 generic.go:334] "Generic (PLEG): container finished" podID="234ff02d-f844-46c2-9a13-9cc6ce370926" 
containerID="9226f96005dd431a7c029c8cba0a5bf4497abdc4939941c9e1d9b05ec0473a8e" exitCode=0 Nov 29 03:09:39 crc kubenswrapper[4749]: I1129 03:09:39.149117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" event={"ID":"234ff02d-f844-46c2-9a13-9cc6ce370926","Type":"ContainerDied","Data":"9226f96005dd431a7c029c8cba0a5bf4497abdc4939941c9e1d9b05ec0473a8e"} Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.671060 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.779256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph\") pod \"234ff02d-f844-46c2-9a13-9cc6ce370926\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.779322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory\") pod \"234ff02d-f844-46c2-9a13-9cc6ce370926\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.779394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w55np\" (UniqueName: \"kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np\") pod \"234ff02d-f844-46c2-9a13-9cc6ce370926\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.779434 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key\") pod \"234ff02d-f844-46c2-9a13-9cc6ce370926\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.779567 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle\") pod \"234ff02d-f844-46c2-9a13-9cc6ce370926\" (UID: \"234ff02d-f844-46c2-9a13-9cc6ce370926\") " Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.785006 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph" (OuterVolumeSpecName: "ceph") pod "234ff02d-f844-46c2-9a13-9cc6ce370926" (UID: "234ff02d-f844-46c2-9a13-9cc6ce370926"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.784941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np" (OuterVolumeSpecName: "kube-api-access-w55np") pod "234ff02d-f844-46c2-9a13-9cc6ce370926" (UID: "234ff02d-f844-46c2-9a13-9cc6ce370926"). InnerVolumeSpecName "kube-api-access-w55np". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.787386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "234ff02d-f844-46c2-9a13-9cc6ce370926" (UID: "234ff02d-f844-46c2-9a13-9cc6ce370926"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.812366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory" (OuterVolumeSpecName: "inventory") pod "234ff02d-f844-46c2-9a13-9cc6ce370926" (UID: "234ff02d-f844-46c2-9a13-9cc6ce370926"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.815534 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "234ff02d-f844-46c2-9a13-9cc6ce370926" (UID: "234ff02d-f844-46c2-9a13-9cc6ce370926"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.881873 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.881909 4749 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.881920 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.881929 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/234ff02d-f844-46c2-9a13-9cc6ce370926-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:09:40 crc kubenswrapper[4749]: I1129 03:09:40.881941 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w55np\" (UniqueName: \"kubernetes.io/projected/234ff02d-f844-46c2-9a13-9cc6ce370926-kube-api-access-w55np\") on node \"crc\" DevicePath \"\"" Nov 29 03:09:41 crc kubenswrapper[4749]: I1129 03:09:41.171733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" event={"ID":"234ff02d-f844-46c2-9a13-9cc6ce370926","Type":"ContainerDied","Data":"7c5154a466864c2fa1a1679e07c9361036ad73985bacc26b0d16335bb1207339"} Nov 29 03:09:41 crc kubenswrapper[4749]: I1129 03:09:41.172064 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5154a466864c2fa1a1679e07c9361036ad73985bacc26b0d16335bb1207339" Nov 29 03:09:41 crc kubenswrapper[4749]: I1129 03:09:41.171769 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5" Nov 29 03:09:46 crc kubenswrapper[4749]: I1129 03:09:46.075218 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:09:46 crc kubenswrapper[4749]: E1129 03:09:46.076020 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.071990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-d2xwc"] Nov 29 03:09:49 crc kubenswrapper[4749]: E1129 03:09:49.073646 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="extract-content" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.073728 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="extract-content" Nov 29 03:09:49 crc kubenswrapper[4749]: E1129 03:09:49.073795 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="extract-utilities" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.073858 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="extract-utilities" Nov 29 03:09:49 crc kubenswrapper[4749]: E1129 03:09:49.073919 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234ff02d-f844-46c2-9a13-9cc6ce370926" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.073974 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="234ff02d-f844-46c2-9a13-9cc6ce370926" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 29 03:09:49 crc kubenswrapper[4749]: E1129 03:09:49.074054 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="registry-server" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.074111 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="registry-server" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.074373 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5f84c0-1b33-4f2d-9fec-ea975af012c3" containerName="registry-server" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.074462 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="234ff02d-f844-46c2-9a13-9cc6ce370926" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.075401 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.080106 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.080120 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.082348 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.085974 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.101366 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-d2xwc"] Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.150822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.151258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.152884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl6rh\" (UniqueName: \"kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.153018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.153586 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.255491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:09:49 crc kubenswrapper[4749]: 
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.255619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.255657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.255706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl6rh\" (UniqueName: \"kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.255745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.266022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.270229 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.273864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.275804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.281483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl6rh\" (UniqueName: \"kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh\") pod \"bootstrap-openstack-openstack-cell1-d2xwc\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.403884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc"
Nov 29 03:09:49 crc kubenswrapper[4749]: I1129 03:09:49.958484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-d2xwc"]
Nov 29 03:09:50 crc kubenswrapper[4749]: I1129 03:09:50.727863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" event={"ID":"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6","Type":"ContainerStarted","Data":"5f2c3c3511d053fefcb89b2dcb4b57a7b1c5f0f1d37db3cb3d096a30c8d761c3"}
Nov 29 03:09:51 crc kubenswrapper[4749]: I1129 03:09:51.742457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" event={"ID":"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6","Type":"ContainerStarted","Data":"3cfba64a03abe8647dfb6656e3dd91c264828ca6b8d36200d0d688ee9d90c77d"}
Nov 29 03:09:51 crc kubenswrapper[4749]: I1129 03:09:51.773679 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" podStartSLOduration=2.049444686 podStartE2EDuration="2.773658884s" podCreationTimestamp="2025-11-29 03:09:49 +0000 UTC" firstStartedPulling="2025-11-29 03:09:49.970618554 +0000 UTC m=+7133.142768411" lastFinishedPulling="2025-11-29 03:09:50.694832732 +0000 UTC m=+7133.866982609" observedRunningTime="2025-11-29 03:09:51.767112825 +0000 UTC m=+7134.939262692" watchObservedRunningTime="2025-11-29 03:09:51.773658884 +0000 UTC m=+7134.945808751"
Nov 29 03:10:01 crc kubenswrapper[4749]: I1129 03:10:01.076233 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:10:01 crc kubenswrapper[4749]: E1129 03:10:01.077407 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:10:13 crc kubenswrapper[4749]: I1129 03:10:13.074990 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:10:13 crc kubenswrapper[4749]: E1129 03:10:13.075910 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:10:26 crc kubenswrapper[4749]: I1129 03:10:26.075552 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:10:26 crc kubenswrapper[4749]: E1129 03:10:26.076298 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:10:41 crc kubenswrapper[4749]: I1129 03:10:41.075905 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:10:41 crc kubenswrapper[4749]: E1129 03:10:41.076823 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:10:52 crc kubenswrapper[4749]: I1129 03:10:52.075400 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:10:52 crc kubenswrapper[4749]: E1129 03:10:52.076405 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:11:06 crc kubenswrapper[4749]: I1129 03:11:06.074979 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:11:06 crc kubenswrapper[4749]: E1129 03:11:06.076153 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:11:20 crc kubenswrapper[4749]: I1129 03:11:20.075262 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:11:20 crc kubenswrapper[4749]: E1129 03:11:20.076247 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:11:32 crc kubenswrapper[4749]: I1129 03:11:32.075488 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f"
Nov 29 03:11:32 crc kubenswrapper[4749]: E1129 03:11:32.077340 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\""
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:11:43 crc kubenswrapper[4749]: I1129 03:11:43.790298 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:11:43 crc kubenswrapper[4749]: E1129 03:11:43.791365 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:11:56 crc kubenswrapper[4749]: I1129 03:11:56.075704 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:11:56 crc kubenswrapper[4749]: E1129 03:11:56.076975 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:12:07 crc kubenswrapper[4749]: I1129 03:12:07.087882 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:12:07 crc kubenswrapper[4749]: E1129 03:12:07.089035 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.594911 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.598539 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.612440 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.790062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8zg\" (UniqueName: \"kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.790482 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.790630 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.893158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.893321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8zg\" (UniqueName: \"kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.893429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.893685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.893729 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.913921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mw8zg\" (UniqueName: \"kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg\") pod \"redhat-operators-nvsdn\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:19 crc kubenswrapper[4749]: I1129 03:12:19.936530 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:20 crc kubenswrapper[4749]: I1129 03:12:20.396659 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:21 crc kubenswrapper[4749]: I1129 03:12:21.074922 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:12:21 crc kubenswrapper[4749]: E1129 03:12:21.075620 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:12:21 crc kubenswrapper[4749]: I1129 03:12:21.294595 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerID="70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65" exitCode=0 Nov 29 03:12:21 crc kubenswrapper[4749]: I1129 03:12:21.294701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerDied","Data":"70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65"} Nov 29 03:12:21 crc kubenswrapper[4749]: I1129 03:12:21.294752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerStarted","Data":"d530f47d00d553d794a1cf78cb73baa1339946b3837debf9d337fc96ce5abe09"} Nov 29 03:12:21 crc kubenswrapper[4749]: I1129 03:12:21.299916 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:12:22 crc kubenswrapper[4749]: I1129 03:12:22.307724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerStarted","Data":"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517"} Nov 29 03:12:26 crc kubenswrapper[4749]: I1129 03:12:26.349049 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerID="2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517" exitCode=0 Nov 29 03:12:26 crc kubenswrapper[4749]: I1129 03:12:26.349110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerDied","Data":"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517"} Nov 29 03:12:27 crc kubenswrapper[4749]: I1129 03:12:27.375182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" 
event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerStarted","Data":"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55"} Nov 29 03:12:27 crc kubenswrapper[4749]: I1129 03:12:27.410629 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nvsdn" podStartSLOduration=2.7932492140000003 podStartE2EDuration="8.410585884s" podCreationTimestamp="2025-11-29 03:12:19 +0000 UTC" firstStartedPulling="2025-11-29 03:12:21.299554594 +0000 UTC m=+7284.471704451" lastFinishedPulling="2025-11-29 03:12:26.916891224 +0000 UTC m=+7290.089041121" observedRunningTime="2025-11-29 03:12:27.39805919 +0000 UTC m=+7290.570209057" watchObservedRunningTime="2025-11-29 03:12:27.410585884 +0000 UTC m=+7290.582735751" Nov 29 03:12:29 crc kubenswrapper[4749]: I1129 03:12:29.936928 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:29 crc kubenswrapper[4749]: I1129 03:12:29.937284 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:31 crc kubenswrapper[4749]: I1129 03:12:31.006053 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nvsdn" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="registry-server" probeResult="failure" output=< Nov 29 03:12:31 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 03:12:31 crc kubenswrapper[4749]: > Nov 29 03:12:34 crc kubenswrapper[4749]: I1129 03:12:34.075139 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:12:34 crc kubenswrapper[4749]: E1129 03:12:34.077515 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:12:40 crc kubenswrapper[4749]: I1129 03:12:40.001736 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:40 crc kubenswrapper[4749]: I1129 03:12:40.079022 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:40 crc kubenswrapper[4749]: I1129 03:12:40.259830 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:41 crc kubenswrapper[4749]: I1129 03:12:41.536476 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nvsdn" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="registry-server" containerID="cri-o://74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55" gracePeriod=2 Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.175461 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.198588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8zg\" (UniqueName: \"kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg\") pod \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.198703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content\") pod \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.198972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities\") pod \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\" (UID: \"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac\") " Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.202245 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities" (OuterVolumeSpecName: "utilities") pod "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" (UID: "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.216527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg" (OuterVolumeSpecName: "kube-api-access-mw8zg") pod "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" (UID: "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac"). InnerVolumeSpecName "kube-api-access-mw8zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.324371 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.324422 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8zg\" (UniqueName: \"kubernetes.io/projected/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-kube-api-access-mw8zg\") on node \"crc\" DevicePath \"\"" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.329115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" (UID: "f0d6ec80-1ece-41b7-8b21-653bcf5b86ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.425901 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.546651 4749 generic.go:334] "Generic (PLEG): container finished" podID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerID="74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55" exitCode=0 Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.546693 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerDied","Data":"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55"} Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.546718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nvsdn" event={"ID":"f0d6ec80-1ece-41b7-8b21-653bcf5b86ac","Type":"ContainerDied","Data":"d530f47d00d553d794a1cf78cb73baa1339946b3837debf9d337fc96ce5abe09"} Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.546734 4749 scope.go:117] "RemoveContainer" containerID="74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.546851 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nvsdn" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.579852 4749 scope.go:117] "RemoveContainer" containerID="2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.587369 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.597240 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nvsdn"] Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.608365 4749 scope.go:117] "RemoveContainer" containerID="70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.685237 4749 scope.go:117] "RemoveContainer" containerID="74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55" Nov 29 03:12:42 crc kubenswrapper[4749]: E1129 03:12:42.685776 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55\": container with ID starting with 74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55 not found: ID does not exist" containerID="74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.685852 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55"} err="failed to get container status \"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55\": rpc error: code = NotFound desc = could not find container \"74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55\": container with ID starting with 74d8ccc4dedb0be680bac5c08d5d1944822f9410954b1241658d5b56d036ce55 not found: ID does not exist" Nov 29 03:12:42 crc 
kubenswrapper[4749]: I1129 03:12:42.685885 4749 scope.go:117] "RemoveContainer" containerID="2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517" Nov 29 03:12:42 crc kubenswrapper[4749]: E1129 03:12:42.686448 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517\": container with ID starting with 2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517 not found: ID does not exist" containerID="2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.686494 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517"} err="failed to get container status \"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517\": rpc error: code = NotFound desc = could not find container \"2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517\": container with ID starting with 2020d16ffa96109494ebd0a50ba7b9ed680d0b942fb75af5cf1815e7cbd36517 not found: ID does not exist" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.686517 4749 scope.go:117] "RemoveContainer" containerID="70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65" Nov 29 03:12:42 crc kubenswrapper[4749]: E1129 03:12:42.686766 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65\": container with ID starting with 70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65 not found: ID does not exist" containerID="70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65" Nov 29 03:12:42 crc kubenswrapper[4749]: I1129 03:12:42.686794 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65"} err="failed to get container status \"70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65\": rpc error: code = NotFound desc = could not find container \"70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65\": container with ID starting with 70ef5c99bc7731dd12c54a86effb3354df89683ed058bac887c172b9771d8f65 not found: ID does not exist" Nov 29 03:12:43 crc kubenswrapper[4749]: I1129 03:12:43.097310 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" path="/var/lib/kubelet/pods/f0d6ec80-1ece-41b7-8b21-653bcf5b86ac/volumes" Nov 29 03:12:47 crc kubenswrapper[4749]: I1129 03:12:47.084937 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:12:47 crc kubenswrapper[4749]: E1129 03:12:47.085772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:12:59 crc kubenswrapper[4749]: I1129 03:12:59.075943 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" 
Nov 29 03:12:59 crc kubenswrapper[4749]: E1129 03:12:59.076923 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:13:06 crc kubenswrapper[4749]: I1129 03:13:06.857485 4749 generic.go:334] "Generic (PLEG): container finished" podID="aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" containerID="3cfba64a03abe8647dfb6656e3dd91c264828ca6b8d36200d0d688ee9d90c77d" exitCode=0 Nov 29 03:13:06 crc kubenswrapper[4749]: I1129 03:13:06.857648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" event={"ID":"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6","Type":"ContainerDied","Data":"3cfba64a03abe8647dfb6656e3dd91c264828ca6b8d36200d0d688ee9d90c77d"} Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.437717 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.470823 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl6rh\" (UniqueName: \"kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh\") pod \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.470875 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph\") pod \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.470904 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory\") pod \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.471236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key\") pod \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.471295 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle\") pod \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\" (UID: \"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6\") " Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.481385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph" (OuterVolumeSpecName: "ceph") pod "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" (UID: "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.485414 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" (UID: "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.485953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh" (OuterVolumeSpecName: "kube-api-access-hl6rh") pod "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" (UID: "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6"). InnerVolumeSpecName "kube-api-access-hl6rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.520545 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory" (OuterVolumeSpecName: "inventory") pod "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" (UID: "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.529295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" (UID: "aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.573175 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.573225 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.573238 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl6rh\" (UniqueName: \"kubernetes.io/projected/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-kube-api-access-hl6rh\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.573248 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.573257 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.925537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" event={"ID":"aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6","Type":"ContainerDied","Data":"5f2c3c3511d053fefcb89b2dcb4b57a7b1c5f0f1d37db3cb3d096a30c8d761c3"} Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.925599 4749 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2c3c3511d053fefcb89b2dcb4b57a7b1c5f0f1d37db3cb3d096a30c8d761c3" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.926354 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-d2xwc" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.996782 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-w4srh"] Nov 29 03:13:08 crc kubenswrapper[4749]: E1129 03:13:08.997526 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" containerName="bootstrap-openstack-openstack-cell1" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997547 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" containerName="bootstrap-openstack-openstack-cell1" Nov 29 03:13:08 crc kubenswrapper[4749]: E1129 03:13:08.997564 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="extract-content" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="extract-content" Nov 29 03:13:08 crc kubenswrapper[4749]: E1129 03:13:08.997585 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="extract-utilities" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997593 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="extract-utilities" Nov 29 03:13:08 crc kubenswrapper[4749]: E1129 03:13:08.997625 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="registry-server" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997633 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="registry-server" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997878 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6" containerName="bootstrap-openstack-openstack-cell1" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.997893 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d6ec80-1ece-41b7-8b21-653bcf5b86ac" containerName="registry-server" Nov 29 03:13:08 crc kubenswrapper[4749]: I1129 03:13:08.998790 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.003662 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.003866 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.004353 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.004676 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.029467 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-w4srh"] Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.085091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.085146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.085223 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.085329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz6fh\" (UniqueName: \"kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.187912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.187968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc 
kubenswrapper[4749]: I1129 03:13:09.188001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.188056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz6fh\" (UniqueName: \"kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.192435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.193507 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.204730 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.210607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz6fh\" (UniqueName: \"kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh\") pod \"download-cache-openstack-openstack-cell1-w4srh\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.330422 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.728604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-w4srh"] Nov 29 03:13:09 crc kubenswrapper[4749]: I1129 03:13:09.952391 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" event={"ID":"f5944a87-7112-4372-b615-59ae77bec28b","Type":"ContainerStarted","Data":"081c89eaec482d662974346883cd54cec9dfd41f0752750ea86e24ba9fe93bea"} Nov 29 03:13:10 crc kubenswrapper[4749]: I1129 03:13:10.965281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" event={"ID":"f5944a87-7112-4372-b615-59ae77bec28b","Type":"ContainerStarted","Data":"11754d00214429d8be56958a3f1858a2cef16f47dad2076835dd19fb3a692ab9"} Nov 29 03:13:10 crc kubenswrapper[4749]: I1129 03:13:10.990726 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" podStartSLOduration=2.442450585 podStartE2EDuration="2.99067098s" podCreationTimestamp="2025-11-29 03:13:08 +0000 UTC" firstStartedPulling="2025-11-29 03:13:09.741114922 +0000 UTC m=+7332.913264779" lastFinishedPulling="2025-11-29 03:13:10.289335317 +0000 UTC m=+7333.461485174" observedRunningTime="2025-11-29 03:13:10.981732953 +0000 UTC m=+7334.153882850" watchObservedRunningTime="2025-11-29 03:13:10.99067098 +0000 UTC m=+7334.162820867" Nov 29 03:13:11 crc kubenswrapper[4749]: I1129 03:13:11.075732 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:13:11 crc kubenswrapper[4749]: E1129 03:13:11.076041 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.059810 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.064094 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.085061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.257845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvkp\" (UniqueName: \"kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.258668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.258809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.361477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.361720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.361892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvkp\" (UniqueName: \"kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.362147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.362226 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.396216 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7rvkp\" (UniqueName: \"kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp\") pod \"community-operators-nblww\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.397085 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.942315 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:12 crc kubenswrapper[4749]: I1129 03:13:12.984878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerStarted","Data":"0ee114f2f3f24e2f882aeced4d0a43cb563f2fb66d8086d26f3adebb6ae86948"} Nov 29 03:13:14 crc kubenswrapper[4749]: I1129 03:13:14.003316 4749 generic.go:334] "Generic (PLEG): container finished" podID="50484b4e-7245-4004-a308-cb35cfaef96a" containerID="24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e" exitCode=0 Nov 29 03:13:14 crc kubenswrapper[4749]: I1129 03:13:14.003429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerDied","Data":"24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e"} Nov 29 03:13:16 crc kubenswrapper[4749]: I1129 03:13:16.034916 4749 generic.go:334] "Generic (PLEG): container finished" podID="50484b4e-7245-4004-a308-cb35cfaef96a" containerID="133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3" exitCode=0 Nov 29 03:13:16 crc kubenswrapper[4749]: I1129 03:13:16.035020 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerDied","Data":"133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3"} Nov 29 03:13:17 crc kubenswrapper[4749]: I1129 03:13:17.051813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerStarted","Data":"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56"} Nov 29 03:13:17 crc kubenswrapper[4749]: I1129 03:13:17.081541 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nblww" podStartSLOduration=2.573068318 podStartE2EDuration="5.08152655s" podCreationTimestamp="2025-11-29 03:13:12 +0000 UTC" firstStartedPulling="2025-11-29 03:13:14.009670494 +0000 UTC m=+7337.181820391" lastFinishedPulling="2025-11-29 03:13:16.518128756 +0000 UTC m=+7339.690278623" observedRunningTime="2025-11-29 03:13:17.077478462 +0000 UTC m=+7340.249628319" watchObservedRunningTime="2025-11-29 03:13:17.08152655 +0000 UTC m=+7340.253676407" Nov 29 03:13:22 crc kubenswrapper[4749]: I1129 03:13:22.398417 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:22 crc kubenswrapper[4749]: I1129 03:13:22.398839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:22 crc kubenswrapper[4749]: I1129 03:13:22.501451 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:23 crc kubenswrapper[4749]: I1129 03:13:23.184769 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:23 crc kubenswrapper[4749]: I1129 03:13:23.238431 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.161229 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nblww" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="registry-server" containerID="cri-o://9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56" gracePeriod=2 Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.729515 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.792527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content\") pod \"50484b4e-7245-4004-a308-cb35cfaef96a\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.792704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rvkp\" (UniqueName: \"kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp\") pod \"50484b4e-7245-4004-a308-cb35cfaef96a\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.792795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities\") pod \"50484b4e-7245-4004-a308-cb35cfaef96a\" (UID: \"50484b4e-7245-4004-a308-cb35cfaef96a\") " Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.793598 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities" (OuterVolumeSpecName: "utilities") pod "50484b4e-7245-4004-a308-cb35cfaef96a" (UID: "50484b4e-7245-4004-a308-cb35cfaef96a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.798189 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp" (OuterVolumeSpecName: "kube-api-access-7rvkp") pod "50484b4e-7245-4004-a308-cb35cfaef96a" (UID: "50484b4e-7245-4004-a308-cb35cfaef96a"). InnerVolumeSpecName "kube-api-access-7rvkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.895868 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rvkp\" (UniqueName: \"kubernetes.io/projected/50484b4e-7245-4004-a308-cb35cfaef96a-kube-api-access-7rvkp\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.895907 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.994002 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50484b4e-7245-4004-a308-cb35cfaef96a" (UID: "50484b4e-7245-4004-a308-cb35cfaef96a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:13:25 crc kubenswrapper[4749]: I1129 03:13:25.997699 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50484b4e-7245-4004-a308-cb35cfaef96a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.075157 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:13:26 crc kubenswrapper[4749]: E1129 03:13:26.075597 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.173134 4749 generic.go:334] "Generic (PLEG): container finished" podID="50484b4e-7245-4004-a308-cb35cfaef96a" containerID="9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56" exitCode=0 Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.173180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerDied","Data":"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56"} Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.173225 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nblww" event={"ID":"50484b4e-7245-4004-a308-cb35cfaef96a","Type":"ContainerDied","Data":"0ee114f2f3f24e2f882aeced4d0a43cb563f2fb66d8086d26f3adebb6ae86948"} Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.173243 4749 scope.go:117] "RemoveContainer" containerID="9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.173277 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nblww" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.204017 4749 scope.go:117] "RemoveContainer" containerID="133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.235066 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.244119 4749 scope.go:117] "RemoveContainer" containerID="24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.250931 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nblww"] Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.298079 4749 scope.go:117] "RemoveContainer" containerID="9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56" Nov 29 03:13:26 crc kubenswrapper[4749]: E1129 03:13:26.298621 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56\": container with ID starting with 9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56 not found: ID does not exist" containerID="9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.298657 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56"} err="failed to get container status \"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56\": rpc error: code = NotFound desc = could not find container \"9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56\": container with ID starting with 9fe42aa394a52e7f771ee8f2fbd1cd0d25faff8e9df177761f045b8b89bddd56 not found: ID does not exist" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.298685 4749 scope.go:117] "RemoveContainer" containerID="133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3" Nov 29 03:13:26 crc kubenswrapper[4749]: E1129 03:13:26.299107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3\": container with ID starting with 133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3 not found: ID does not exist" containerID="133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.299154 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3"} err="failed to get container status \"133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3\": rpc error: code = NotFound desc = could not find container \"133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3\": container with ID starting with 133d0546fae9c551d76e5f375804a9813fcc09f06ad7b24994665c721fea55e3 not found: ID does not exist" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.299185 4749 scope.go:117] "RemoveContainer" containerID="24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e" Nov 29 03:13:26 crc kubenswrapper[4749]: E1129 03:13:26.299517 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e\": container with ID starting with 24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e not found: ID does not exist" containerID="24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e" Nov 29 03:13:26 crc kubenswrapper[4749]: I1129 03:13:26.299550 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e"} err="failed to get container status \"24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e\": rpc error: code = NotFound desc = could not find container \"24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e\": container with ID starting with 24106b76a1b20c38e27890f7c9c4f0f016c721c785680bad513bb49f5ce6cd0e not found: ID does not exist" Nov 29 03:13:27 crc kubenswrapper[4749]: I1129 03:13:27.096085 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" path="/var/lib/kubelet/pods/50484b4e-7245-4004-a308-cb35cfaef96a/volumes" Nov 29 03:13:38 crc kubenswrapper[4749]: I1129 03:13:38.076833 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:13:38 crc kubenswrapper[4749]: E1129 03:13:38.078049 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:13:51 crc kubenswrapper[4749]: I1129 03:13:51.075470 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:13:51 crc kubenswrapper[4749]: E1129 03:13:51.076685 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:14:03 crc kubenswrapper[4749]: I1129 03:14:03.075593 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:14:04 crc kubenswrapper[4749]: I1129 03:14:04.075816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c"} Nov 29 03:14:47 crc kubenswrapper[4749]: I1129 03:14:47.657042 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5944a87-7112-4372-b615-59ae77bec28b" containerID="11754d00214429d8be56958a3f1858a2cef16f47dad2076835dd19fb3a692ab9" exitCode=0 Nov 29 03:14:47 crc kubenswrapper[4749]: I1129 03:14:47.657220 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" 
event={"ID":"f5944a87-7112-4372-b615-59ae77bec28b","Type":"ContainerDied","Data":"11754d00214429d8be56958a3f1858a2cef16f47dad2076835dd19fb3a692ab9"} Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.256401 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.364709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz6fh\" (UniqueName: \"kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh\") pod \"f5944a87-7112-4372-b615-59ae77bec28b\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.364774 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory\") pod \"f5944a87-7112-4372-b615-59ae77bec28b\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.364939 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph\") pod \"f5944a87-7112-4372-b615-59ae77bec28b\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.364972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key\") pod \"f5944a87-7112-4372-b615-59ae77bec28b\" (UID: \"f5944a87-7112-4372-b615-59ae77bec28b\") " Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.370097 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh" (OuterVolumeSpecName: "kube-api-access-hz6fh") pod "f5944a87-7112-4372-b615-59ae77bec28b" (UID: "f5944a87-7112-4372-b615-59ae77bec28b"). InnerVolumeSpecName "kube-api-access-hz6fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.375408 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph" (OuterVolumeSpecName: "ceph") pod "f5944a87-7112-4372-b615-59ae77bec28b" (UID: "f5944a87-7112-4372-b615-59ae77bec28b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.392040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory" (OuterVolumeSpecName: "inventory") pod "f5944a87-7112-4372-b615-59ae77bec28b" (UID: "f5944a87-7112-4372-b615-59ae77bec28b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.392829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f5944a87-7112-4372-b615-59ae77bec28b" (UID: "f5944a87-7112-4372-b615-59ae77bec28b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.467818 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz6fh\" (UniqueName: \"kubernetes.io/projected/f5944a87-7112-4372-b615-59ae77bec28b-kube-api-access-hz6fh\") on node \"crc\" DevicePath \"\"" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.467876 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.467896 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.467915 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f5944a87-7112-4372-b615-59ae77bec28b-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.685805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" event={"ID":"f5944a87-7112-4372-b615-59ae77bec28b","Type":"ContainerDied","Data":"081c89eaec482d662974346883cd54cec9dfd41f0752750ea86e24ba9fe93bea"} Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.686163 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081c89eaec482d662974346883cd54cec9dfd41f0752750ea86e24ba9fe93bea" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.685876 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-w4srh" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.802545 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vkzv6"] Nov 29 03:14:49 crc kubenswrapper[4749]: E1129 03:14:49.803040 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="registry-server" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803060 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="registry-server" Nov 29 03:14:49 crc kubenswrapper[4749]: E1129 03:14:49.803093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="extract-utilities" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803100 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="extract-utilities" Nov 29 03:14:49 crc kubenswrapper[4749]: E1129 03:14:49.803115 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5944a87-7112-4372-b615-59ae77bec28b" containerName="download-cache-openstack-openstack-cell1" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803122 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5944a87-7112-4372-b615-59ae77bec28b" containerName="download-cache-openstack-openstack-cell1" Nov 29 03:14:49 crc kubenswrapper[4749]: E1129 03:14:49.803139 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="extract-content" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803146 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="extract-content" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803357 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5944a87-7112-4372-b615-59ae77bec28b" containerName="download-cache-openstack-openstack-cell1" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.803374 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="50484b4e-7245-4004-a308-cb35cfaef96a" containerName="registry-server" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.804159 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.807109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.807324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.807464 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.807581 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.817311 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vkzv6"] Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.876726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhzq\" (UniqueName: \"kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.876804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.876903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.876945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.979127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.979189 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.979278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhzq\" (UniqueName: \"kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.979323 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.984055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.985190 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:49 crc kubenswrapper[4749]: I1129 03:14:49.992303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:50 crc kubenswrapper[4749]: I1129 03:14:50.006589 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhzq\" (UniqueName: \"kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq\") pod \"configure-network-openstack-openstack-cell1-vkzv6\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:50 crc kubenswrapper[4749]: I1129 03:14:50.137472 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:14:50 crc kubenswrapper[4749]: I1129 03:14:50.798136 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vkzv6"] Nov 29 03:14:51 crc kubenswrapper[4749]: I1129 03:14:51.710990 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" event={"ID":"74859cb5-3819-4f6a-8eae-b82e47e0f7e4","Type":"ContainerStarted","Data":"bf89995bb0d73adb87eef52ad055e8b680667df2905cd337c74129ad9fdda84b"} Nov 29 03:14:52 crc kubenswrapper[4749]: I1129 03:14:52.725116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" event={"ID":"74859cb5-3819-4f6a-8eae-b82e47e0f7e4","Type":"ContainerStarted","Data":"b9c968ecd807046e91934e86ce2365ce13bb9e1eb932cccd12c761aed7877efa"} Nov 29 03:14:52 crc kubenswrapper[4749]: I1129 03:14:52.742464 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" podStartSLOduration=2.910133456 podStartE2EDuration="3.742445331s" podCreationTimestamp="2025-11-29 03:14:49 +0000 UTC" firstStartedPulling="2025-11-29 03:14:50.80189518 +0000 UTC m=+7433.974045037" lastFinishedPulling="2025-11-29 03:14:51.634207055 +0000 UTC m=+7434.806356912" observedRunningTime="2025-11-29 03:14:52.739608502 +0000 UTC m=+7435.911758369" watchObservedRunningTime="2025-11-29 03:14:52.742445331 +0000 UTC m=+7435.914595198" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.157295 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7"] Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.160040 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.162642 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.165234 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.173123 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7"] Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.227347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxqp\" (UniqueName: \"kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.227406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.227789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.330113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxqp\" (UniqueName: \"kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.330171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.330279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.331103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume\") pod 
\"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.338102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.347136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxqp\" (UniqueName: \"kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp\") pod \"collect-profiles-29406435-cqtm7\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:00 crc kubenswrapper[4749]: I1129 03:15:00.484364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:01 crc kubenswrapper[4749]: I1129 03:15:01.014567 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7"] Nov 29 03:15:01 crc kubenswrapper[4749]: I1129 03:15:01.829786 4749 generic.go:334] "Generic (PLEG): container finished" podID="3803407e-f7af-4747-82ad-ecb8b23db732" containerID="fc68129932f81586db52b11e911281fb3fe4ceb4e7cb7bb86578d5c97d2376cd" exitCode=0 Nov 29 03:15:01 crc kubenswrapper[4749]: I1129 03:15:01.829828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" event={"ID":"3803407e-f7af-4747-82ad-ecb8b23db732","Type":"ContainerDied","Data":"fc68129932f81586db52b11e911281fb3fe4ceb4e7cb7bb86578d5c97d2376cd"} Nov 29 03:15:01 crc kubenswrapper[4749]: I1129 03:15:01.830099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" event={"ID":"3803407e-f7af-4747-82ad-ecb8b23db732","Type":"ContainerStarted","Data":"032675661a62cd01a8406f772bebf45064bb32982eb0929541f4d30bda5a884d"} Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.307188 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.403664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume\") pod \"3803407e-f7af-4747-82ad-ecb8b23db732\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.404366 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fxqp\" (UniqueName: \"kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp\") pod \"3803407e-f7af-4747-82ad-ecb8b23db732\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.404695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume\") pod \"3803407e-f7af-4747-82ad-ecb8b23db732\" (UID: \"3803407e-f7af-4747-82ad-ecb8b23db732\") " Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.405364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume" (OuterVolumeSpecName: "config-volume") pod "3803407e-f7af-4747-82ad-ecb8b23db732" (UID: "3803407e-f7af-4747-82ad-ecb8b23db732"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.406056 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3803407e-f7af-4747-82ad-ecb8b23db732-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.409229 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3803407e-f7af-4747-82ad-ecb8b23db732" (UID: "3803407e-f7af-4747-82ad-ecb8b23db732"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.412498 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp" (OuterVolumeSpecName: "kube-api-access-8fxqp") pod "3803407e-f7af-4747-82ad-ecb8b23db732" (UID: "3803407e-f7af-4747-82ad-ecb8b23db732"). InnerVolumeSpecName "kube-api-access-8fxqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.507824 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fxqp\" (UniqueName: \"kubernetes.io/projected/3803407e-f7af-4747-82ad-ecb8b23db732-kube-api-access-8fxqp\") on node \"crc\" DevicePath \"\"" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.507858 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3803407e-f7af-4747-82ad-ecb8b23db732-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.855860 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.855822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7" event={"ID":"3803407e-f7af-4747-82ad-ecb8b23db732","Type":"ContainerDied","Data":"032675661a62cd01a8406f772bebf45064bb32982eb0929541f4d30bda5a884d"} Nov 29 03:15:03 crc kubenswrapper[4749]: I1129 03:15:03.855923 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="032675661a62cd01a8406f772bebf45064bb32982eb0929541f4d30bda5a884d" Nov 29 03:15:04 crc kubenswrapper[4749]: I1129 03:15:04.380308 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh"] Nov 29 03:15:04 crc kubenswrapper[4749]: I1129 03:15:04.388112 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406390-mvqkh"] Nov 29 03:15:05 crc kubenswrapper[4749]: I1129 03:15:05.097897 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1cf6c0f-36d8-4b64-89ae-55d85218f65a" path="/var/lib/kubelet/pods/f1cf6c0f-36d8-4b64-89ae-55d85218f65a/volumes" Nov 29 03:15:26 crc kubenswrapper[4749]: I1129 03:15:26.302594 4749 scope.go:117] "RemoveContainer" containerID="b2d0afbb1d7a1a3ffbaf6b6a7bb0c632e68165c992edc50dafd45ec5213467ee" Nov 29 03:16:18 crc kubenswrapper[4749]: I1129 03:16:18.740967 4749 generic.go:334] "Generic (PLEG): container finished" podID="74859cb5-3819-4f6a-8eae-b82e47e0f7e4" containerID="b9c968ecd807046e91934e86ce2365ce13bb9e1eb932cccd12c761aed7877efa" exitCode=0 Nov 29 03:16:18 crc kubenswrapper[4749]: I1129 03:16:18.741424 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" event={"ID":"74859cb5-3819-4f6a-8eae-b82e47e0f7e4","Type":"ContainerDied","Data":"b9c968ecd807046e91934e86ce2365ce13bb9e1eb932cccd12c761aed7877efa"} Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.303444 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.439268 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph\") pod \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.451766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory\") pod \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.451831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key\") pod \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.451864 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhzq\" (UniqueName: \"kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq\") pod \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\" (UID: \"74859cb5-3819-4f6a-8eae-b82e47e0f7e4\") " Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.457266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq" (OuterVolumeSpecName: "kube-api-access-kjhzq") pod "74859cb5-3819-4f6a-8eae-b82e47e0f7e4" (UID: "74859cb5-3819-4f6a-8eae-b82e47e0f7e4"). InnerVolumeSpecName "kube-api-access-kjhzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.460364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph" (OuterVolumeSpecName: "ceph") pod "74859cb5-3819-4f6a-8eae-b82e47e0f7e4" (UID: "74859cb5-3819-4f6a-8eae-b82e47e0f7e4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.491178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74859cb5-3819-4f6a-8eae-b82e47e0f7e4" (UID: "74859cb5-3819-4f6a-8eae-b82e47e0f7e4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.491940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory" (OuterVolumeSpecName: "inventory") pod "74859cb5-3819-4f6a-8eae-b82e47e0f7e4" (UID: "74859cb5-3819-4f6a-8eae-b82e47e0f7e4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.554677 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.554715 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.554731 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhzq\" (UniqueName: \"kubernetes.io/projected/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-kube-api-access-kjhzq\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.554744 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/74859cb5-3819-4f6a-8eae-b82e47e0f7e4-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.765689 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" event={"ID":"74859cb5-3819-4f6a-8eae-b82e47e0f7e4","Type":"ContainerDied","Data":"bf89995bb0d73adb87eef52ad055e8b680667df2905cd337c74129ad9fdda84b"} Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.765750 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf89995bb0d73adb87eef52ad055e8b680667df2905cd337c74129ad9fdda84b" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.765777 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vkzv6" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.867192 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-5t9tc"] Nov 29 03:16:20 crc kubenswrapper[4749]: E1129 03:16:20.867714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3803407e-f7af-4747-82ad-ecb8b23db732" containerName="collect-profiles" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.867735 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3803407e-f7af-4747-82ad-ecb8b23db732" containerName="collect-profiles" Nov 29 03:16:20 crc kubenswrapper[4749]: E1129 03:16:20.867776 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74859cb5-3819-4f6a-8eae-b82e47e0f7e4" containerName="configure-network-openstack-openstack-cell1" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.867788 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="74859cb5-3819-4f6a-8eae-b82e47e0f7e4" containerName="configure-network-openstack-openstack-cell1" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.868074 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="74859cb5-3819-4f6a-8eae-b82e47e0f7e4" containerName="configure-network-openstack-openstack-cell1" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.868136 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3803407e-f7af-4747-82ad-ecb8b23db732" containerName="collect-profiles" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.869257 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.871838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.872215 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.872377 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.872501 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.880906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-5t9tc"] Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.965086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpdpz\" (UniqueName: \"kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.965321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.965532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:20 crc kubenswrapper[4749]: I1129 03:16:20.965578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.067715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.067788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 
29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.067825 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.067984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpdpz\" (UniqueName: \"kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.074712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.074762 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.075141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.094088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpdpz\" (UniqueName: \"kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz\") pod \"validate-network-openstack-openstack-cell1-5t9tc\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.258964 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:21 crc kubenswrapper[4749]: I1129 03:16:21.929478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-5t9tc"] Nov 29 03:16:22 crc kubenswrapper[4749]: I1129 03:16:22.787788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" event={"ID":"8a0dca15-f50b-4ac0-9d64-052462449692","Type":"ContainerStarted","Data":"44260552ec556cfda64969cf3da45e72812b1c4ee1710fe44f53030b8c7e6bf5"} Nov 29 03:16:22 crc kubenswrapper[4749]: I1129 03:16:22.788080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" event={"ID":"8a0dca15-f50b-4ac0-9d64-052462449692","Type":"ContainerStarted","Data":"81e01bc09a0f6dd090094d648d687535ed6137e4153398ec56e9fe7a3b514280"} Nov 29 03:16:22 crc kubenswrapper[4749]: I1129 03:16:22.812741 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" podStartSLOduration=2.357578099 podStartE2EDuration="2.812720113s" podCreationTimestamp="2025-11-29 03:16:20 +0000 UTC" firstStartedPulling="2025-11-29 03:16:21.926639633 +0000 UTC m=+7525.098789490" lastFinishedPulling="2025-11-29 03:16:22.381781647 +0000 UTC m=+7525.553931504" observedRunningTime="2025-11-29 03:16:22.80516297 +0000 UTC m=+7525.977312857" watchObservedRunningTime="2025-11-29 03:16:22.812720113 +0000 UTC m=+7525.984869970" Nov 29 03:16:25 crc kubenswrapper[4749]: I1129 03:16:25.374717 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:16:25 crc kubenswrapper[4749]: I1129 03:16:25.375536 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:16:27 crc kubenswrapper[4749]: I1129 03:16:27.845519 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a0dca15-f50b-4ac0-9d64-052462449692" containerID="44260552ec556cfda64969cf3da45e72812b1c4ee1710fe44f53030b8c7e6bf5" exitCode=0 Nov 29 03:16:27 crc kubenswrapper[4749]: I1129 03:16:27.845714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" event={"ID":"8a0dca15-f50b-4ac0-9d64-052462449692","Type":"ContainerDied","Data":"44260552ec556cfda64969cf3da45e72812b1c4ee1710fe44f53030b8c7e6bf5"} Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.547152 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.691046 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpdpz\" (UniqueName: \"kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz\") pod \"8a0dca15-f50b-4ac0-9d64-052462449692\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.691120 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph\") pod \"8a0dca15-f50b-4ac0-9d64-052462449692\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.691322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key\") pod \"8a0dca15-f50b-4ac0-9d64-052462449692\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.691565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory\") pod \"8a0dca15-f50b-4ac0-9d64-052462449692\" (UID: \"8a0dca15-f50b-4ac0-9d64-052462449692\") " Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.699466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph" (OuterVolumeSpecName: "ceph") pod "8a0dca15-f50b-4ac0-9d64-052462449692" (UID: "8a0dca15-f50b-4ac0-9d64-052462449692"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.700364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz" (OuterVolumeSpecName: "kube-api-access-tpdpz") pod "8a0dca15-f50b-4ac0-9d64-052462449692" (UID: "8a0dca15-f50b-4ac0-9d64-052462449692"). InnerVolumeSpecName "kube-api-access-tpdpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.727950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory" (OuterVolumeSpecName: "inventory") pod "8a0dca15-f50b-4ac0-9d64-052462449692" (UID: "8a0dca15-f50b-4ac0-9d64-052462449692"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.743892 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a0dca15-f50b-4ac0-9d64-052462449692" (UID: "8a0dca15-f50b-4ac0-9d64-052462449692"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.807831 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.807880 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpdpz\" (UniqueName: \"kubernetes.io/projected/8a0dca15-f50b-4ac0-9d64-052462449692-kube-api-access-tpdpz\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.807905 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.807920 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a0dca15-f50b-4ac0-9d64-052462449692-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.870500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" event={"ID":"8a0dca15-f50b-4ac0-9d64-052462449692","Type":"ContainerDied","Data":"81e01bc09a0f6dd090094d648d687535ed6137e4153398ec56e9fe7a3b514280"} Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.870548 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e01bc09a0f6dd090094d648d687535ed6137e4153398ec56e9fe7a3b514280" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.870557 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-5t9tc" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.984569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tm2vg"] Nov 29 03:16:29 crc kubenswrapper[4749]: E1129 03:16:29.984990 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0dca15-f50b-4ac0-9d64-052462449692" containerName="validate-network-openstack-openstack-cell1" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.985004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0dca15-f50b-4ac0-9d64-052462449692" containerName="validate-network-openstack-openstack-cell1" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.985235 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0dca15-f50b-4ac0-9d64-052462449692" containerName="validate-network-openstack-openstack-cell1" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.985925 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.988793 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.989026 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.989128 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:16:29 crc kubenswrapper[4749]: I1129 03:16:29.990183 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.008081 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tm2vg"] Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.114222 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.114621 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.114683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt8hl\" (UniqueName: \"kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.114817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.217044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.217179 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.217223 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nt8hl\" (UniqueName: \"kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.217264 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.220620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.220871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.223665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.232268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt8hl\" (UniqueName: \"kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl\") pod \"install-os-openstack-openstack-cell1-tm2vg\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.308638 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.835034 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tm2vg"] Nov 29 03:16:30 crc kubenswrapper[4749]: W1129 03:16:30.838036 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79934424_e3be_4b95_843c_65b7f2bcb76f.slice/crio-c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e WatchSource:0}: Error finding container c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e: Status 404 returned error can't find the container with id c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e Nov 29 03:16:30 crc kubenswrapper[4749]: I1129 03:16:30.877939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" event={"ID":"79934424-e3be-4b95-843c-65b7f2bcb76f","Type":"ContainerStarted","Data":"c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e"} Nov 29 03:16:31 crc kubenswrapper[4749]: I1129 03:16:31.888527 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" event={"ID":"79934424-e3be-4b95-843c-65b7f2bcb76f","Type":"ContainerStarted","Data":"b96cc3a109801d717d83336e1fcd96f56be2411963b22bf4f1e1507d3cb8f72a"} Nov 29 03:16:31 crc kubenswrapper[4749]: I1129 03:16:31.909076 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" podStartSLOduration=2.452248124 podStartE2EDuration="2.909059727s" podCreationTimestamp="2025-11-29 03:16:29 +0000 UTC" firstStartedPulling="2025-11-29 03:16:30.840824684 +0000 UTC m=+7534.012974531" lastFinishedPulling="2025-11-29 03:16:31.297636277 +0000 UTC m=+7534.469786134" observedRunningTime="2025-11-29 03:16:31.902894328 +0000 UTC m=+7535.075044185" watchObservedRunningTime="2025-11-29 03:16:31.909059727 +0000 UTC m=+7535.081209584" Nov 29 03:16:55 crc kubenswrapper[4749]: I1129 03:16:55.373797 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:16:55 crc kubenswrapper[4749]: I1129 03:16:55.374464 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:17:18 crc kubenswrapper[4749]: I1129 03:17:18.434819 4749 generic.go:334] "Generic (PLEG): container finished" podID="79934424-e3be-4b95-843c-65b7f2bcb76f" containerID="b96cc3a109801d717d83336e1fcd96f56be2411963b22bf4f1e1507d3cb8f72a" exitCode=0 Nov 29 03:17:18 crc kubenswrapper[4749]: I1129 03:17:18.434970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" event={"ID":"79934424-e3be-4b95-843c-65b7f2bcb76f","Type":"ContainerDied","Data":"b96cc3a109801d717d83336e1fcd96f56be2411963b22bf4f1e1507d3cb8f72a"} Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.038391 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.120232 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key\") pod \"79934424-e3be-4b95-843c-65b7f2bcb76f\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.120293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph\") pod \"79934424-e3be-4b95-843c-65b7f2bcb76f\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.120415 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt8hl\" (UniqueName: \"kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl\") pod \"79934424-e3be-4b95-843c-65b7f2bcb76f\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.120586 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory\") pod \"79934424-e3be-4b95-843c-65b7f2bcb76f\" (UID: \"79934424-e3be-4b95-843c-65b7f2bcb76f\") " Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.130881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph" (OuterVolumeSpecName: "ceph") pod "79934424-e3be-4b95-843c-65b7f2bcb76f" (UID: "79934424-e3be-4b95-843c-65b7f2bcb76f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.132073 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl" (OuterVolumeSpecName: "kube-api-access-nt8hl") pod "79934424-e3be-4b95-843c-65b7f2bcb76f" (UID: "79934424-e3be-4b95-843c-65b7f2bcb76f"). InnerVolumeSpecName "kube-api-access-nt8hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.153407 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "79934424-e3be-4b95-843c-65b7f2bcb76f" (UID: "79934424-e3be-4b95-843c-65b7f2bcb76f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.167460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory" (OuterVolumeSpecName: "inventory") pod "79934424-e3be-4b95-843c-65b7f2bcb76f" (UID: "79934424-e3be-4b95-843c-65b7f2bcb76f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.222794 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.222832 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.222847 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79934424-e3be-4b95-843c-65b7f2bcb76f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.222859 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt8hl\" (UniqueName: \"kubernetes.io/projected/79934424-e3be-4b95-843c-65b7f2bcb76f-kube-api-access-nt8hl\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.458867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" event={"ID":"79934424-e3be-4b95-843c-65b7f2bcb76f","Type":"ContainerDied","Data":"c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e"} Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.458948 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c80635c48fe19e7fb9ac32e8d0a32f8b202c3d17022eb9f9c3639f19ca332c2e" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.459040 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tm2vg" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.577945 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gg88z"] Nov 29 03:17:20 crc kubenswrapper[4749]: E1129 03:17:20.578837 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79934424-e3be-4b95-843c-65b7f2bcb76f" containerName="install-os-openstack-openstack-cell1" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.578863 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="79934424-e3be-4b95-843c-65b7f2bcb76f" containerName="install-os-openstack-openstack-cell1" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.579216 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="79934424-e3be-4b95-843c-65b7f2bcb76f" containerName="install-os-openstack-openstack-cell1" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.580513 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.583612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.583682 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.583814 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.584088 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.592866 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gg88z"] Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.634728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.634804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclnv\" (UniqueName: \"kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.634929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.635068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.737219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.737296 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.737338 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclnv\" (UniqueName: \"kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.737411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.741687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.741694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.748685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.754240 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclnv\" (UniqueName: \"kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv\") pod \"configure-os-openstack-openstack-cell1-gg88z\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:20 crc kubenswrapper[4749]: I1129 03:17:20.924990 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:17:21 crc kubenswrapper[4749]: I1129 03:17:21.475288 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-gg88z"] Nov 29 03:17:21 crc kubenswrapper[4749]: W1129 03:17:21.478214 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aa7ec69_393b_44ce_90f2_4efb3812bbb9.slice/crio-ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08 WatchSource:0}: Error finding container ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08: Status 404 returned error can't find the container with id ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08 Nov 29 03:17:21 crc kubenswrapper[4749]: I1129 03:17:21.482662 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:17:22 crc kubenswrapper[4749]: I1129 03:17:22.486318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" event={"ID":"8aa7ec69-393b-44ce-90f2-4efb3812bbb9","Type":"ContainerStarted","Data":"7623d0aae8d8cb10216d54652f0977734c3788f93c714a76b6fd6e1992fec8ab"} Nov 29 03:17:22 crc kubenswrapper[4749]: I1129 03:17:22.487072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" event={"ID":"8aa7ec69-393b-44ce-90f2-4efb3812bbb9","Type":"ContainerStarted","Data":"ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08"} Nov 29 03:17:22 crc kubenswrapper[4749]: I1129 03:17:22.513446 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" podStartSLOduration=1.9743512490000001 podStartE2EDuration="2.513423492s" podCreationTimestamp="2025-11-29 03:17:20 +0000 UTC" firstStartedPulling="2025-11-29 03:17:21.482252747 +0000 UTC m=+7584.654402634" lastFinishedPulling="2025-11-29 03:17:22.02132501 +0000 UTC m=+7585.193474877" observedRunningTime="2025-11-29 03:17:22.503634734 +0000 UTC m=+7585.675784611" watchObservedRunningTime="2025-11-29 03:17:22.513423492 +0000 UTC m=+7585.685573369" Nov 29 03:17:25 crc kubenswrapper[4749]: I1129 03:17:25.374581 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:17:25 crc kubenswrapper[4749]: I1129 03:17:25.375081 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:17:25 crc kubenswrapper[4749]: I1129 03:17:25.375119 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:17:25 crc kubenswrapper[4749]: I1129 03:17:25.375598 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c"} 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:17:25 crc kubenswrapper[4749]: I1129 03:17:25.375645 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c" gracePeriod=600 Nov 29 03:17:26 crc kubenswrapper[4749]: I1129 03:17:26.536131 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c" exitCode=0 Nov 29 03:17:26 crc kubenswrapper[4749]: I1129 03:17:26.536227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c"} Nov 29 03:17:26 crc kubenswrapper[4749]: I1129 03:17:26.536705 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0"} Nov 29 03:17:26 crc kubenswrapper[4749]: I1129 03:17:26.536723 4749 scope.go:117] "RemoveContainer" containerID="9b9404a2ae1ab8805425f786aa45e0916710e8edcd429ce3540a02cbe4a3063f" Nov 29 03:17:38 crc kubenswrapper[4749]: I1129 03:17:38.931593 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:38 crc kubenswrapper[4749]: I1129 03:17:38.937759 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:38 crc kubenswrapper[4749]: I1129 03:17:38.960061 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.054172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.054427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.054735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65k4\" (UniqueName: \"kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.156244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.156374 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.156579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w65k4\" (UniqueName: \"kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.158849 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.159270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.188906 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w65k4\" (UniqueName: \"kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4\") pod \"certified-operators-bbjdx\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.295711 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:39 crc kubenswrapper[4749]: I1129 03:17:39.824676 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:39 crc kubenswrapper[4749]: W1129 03:17:39.830674 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5789fd1_cdc4_49a8_af97_6cc8c1351abc.slice/crio-1cede9f4556665e271ec675db4d7792dc9cceb7e5e3bea1e13367f2a3756b6a8 WatchSource:0}: Error finding container 1cede9f4556665e271ec675db4d7792dc9cceb7e5e3bea1e13367f2a3756b6a8: Status 404 returned error can't find the container with id 1cede9f4556665e271ec675db4d7792dc9cceb7e5e3bea1e13367f2a3756b6a8 Nov 29 03:17:40 crc kubenswrapper[4749]: I1129 03:17:40.713040 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerID="894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628" exitCode=0 Nov 29 03:17:40 crc kubenswrapper[4749]: I1129 03:17:40.713127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerDied","Data":"894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628"} Nov 29 03:17:40 crc kubenswrapper[4749]: I1129 03:17:40.713528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerStarted","Data":"1cede9f4556665e271ec675db4d7792dc9cceb7e5e3bea1e13367f2a3756b6a8"} Nov 29 03:17:42 crc kubenswrapper[4749]: I1129 03:17:42.968285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerStarted","Data":"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655"} Nov 29 03:17:43 crc kubenswrapper[4749]: I1129 03:17:43.984320 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerID="b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655" exitCode=0 Nov 29 03:17:43 crc kubenswrapper[4749]: I1129 03:17:43.984426 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerDied","Data":"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655"} Nov 29 03:17:44 crc kubenswrapper[4749]: I1129 03:17:44.998234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerStarted","Data":"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937"} Nov 29 03:17:45 crc kubenswrapper[4749]: I1129 03:17:45.019473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbjdx" 
podStartSLOduration=3.311888874 podStartE2EDuration="7.01945216s" podCreationTimestamp="2025-11-29 03:17:38 +0000 UTC" firstStartedPulling="2025-11-29 03:17:40.716457802 +0000 UTC m=+7603.888607659" lastFinishedPulling="2025-11-29 03:17:44.424021078 +0000 UTC m=+7607.596170945" observedRunningTime="2025-11-29 03:17:45.018327502 +0000 UTC m=+7608.190477399" watchObservedRunningTime="2025-11-29 03:17:45.01945216 +0000 UTC m=+7608.191602047" Nov 29 03:17:49 crc kubenswrapper[4749]: I1129 03:17:49.296333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:49 crc kubenswrapper[4749]: I1129 03:17:49.296968 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:49 crc kubenswrapper[4749]: I1129 03:17:49.350484 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:50 crc kubenswrapper[4749]: I1129 03:17:50.123725 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:50 crc kubenswrapper[4749]: I1129 03:17:50.178621 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.115790 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbjdx" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="registry-server" containerID="cri-o://b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937" gracePeriod=2 Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.629993 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.678028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w65k4\" (UniqueName: \"kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4\") pod \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.678158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities\") pod \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.678226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content\") pod \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\" (UID: \"b5789fd1-cdc4-49a8-af97-6cc8c1351abc\") " Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.679988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities" (OuterVolumeSpecName: "utilities") pod "b5789fd1-cdc4-49a8-af97-6cc8c1351abc" (UID: "b5789fd1-cdc4-49a8-af97-6cc8c1351abc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.690943 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4" (OuterVolumeSpecName: "kube-api-access-w65k4") pod "b5789fd1-cdc4-49a8-af97-6cc8c1351abc" (UID: "b5789fd1-cdc4-49a8-af97-6cc8c1351abc"). InnerVolumeSpecName "kube-api-access-w65k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.732390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5789fd1-cdc4-49a8-af97-6cc8c1351abc" (UID: "b5789fd1-cdc4-49a8-af97-6cc8c1351abc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.781441 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w65k4\" (UniqueName: \"kubernetes.io/projected/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-kube-api-access-w65k4\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.781484 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:52 crc kubenswrapper[4749]: I1129 03:17:52.781496 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5789fd1-cdc4-49a8-af97-6cc8c1351abc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.126429 4749 generic.go:334] "Generic (PLEG): container finished" podID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerID="b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937" exitCode=0 Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.126497 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbjdx" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.126522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerDied","Data":"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937"} Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.126894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbjdx" event={"ID":"b5789fd1-cdc4-49a8-af97-6cc8c1351abc","Type":"ContainerDied","Data":"1cede9f4556665e271ec675db4d7792dc9cceb7e5e3bea1e13367f2a3756b6a8"} Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.126916 4749 scope.go:117] "RemoveContainer" containerID="b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.162924 4749 scope.go:117] "RemoveContainer" containerID="b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.169357 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.185074 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbjdx"] Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.196567 4749 scope.go:117] "RemoveContainer" containerID="894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.251009 4749 scope.go:117] "RemoveContainer" containerID="b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937" Nov 29 03:17:53 crc kubenswrapper[4749]: E1129 03:17:53.251442 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937\": container with ID starting with b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937 not found: ID does not exist" containerID="b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.251491 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937"} err="failed to get container status \"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937\": rpc error: code = NotFound desc = could not find container \"b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937\": container with ID starting with b056d5b75be6f1a0c399496c1070df4834da21a50a3648fa93bf1f30aca46937 not found: ID does not exist" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.251524 4749 scope.go:117] "RemoveContainer" containerID="b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655" Nov 29 03:17:53 crc kubenswrapper[4749]: E1129 03:17:53.251886 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655\": container with ID starting with b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655 not found: ID does not exist" containerID="b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.251918 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655"} err="failed to get container status \"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655\": rpc error: code = NotFound desc = could not find container \"b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655\": container with ID starting with b5deed7e070adba6f0a04c1a339c39a3eceb47ffc296af4cceb4d292fd508655 not found: ID does not exist" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.251941 4749 scope.go:117] "RemoveContainer" containerID="894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628" Nov 29 03:17:53 crc kubenswrapper[4749]: E1129 03:17:53.252387 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628\": container with ID starting with 894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628 not found: ID does not exist" containerID="894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628" Nov 29 03:17:53 crc kubenswrapper[4749]: I1129 03:17:53.252413 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628"} err="failed to get container status \"894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628\": rpc error: code = NotFound desc = could not find container \"894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628\": container with ID starting with 894093bafbb4f15c19ae585069a3988fa3d7e591bfa3a1aefb7eac45f2b81628 not found: ID does not exist" Nov 29 03:17:55 crc kubenswrapper[4749]: I1129 03:17:55.092816 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" path="/var/lib/kubelet/pods/b5789fd1-cdc4-49a8-af97-6cc8c1351abc/volumes" Nov 29 03:18:07 crc kubenswrapper[4749]: I1129 03:18:07.316232 4749 generic.go:334] "Generic (PLEG): container finished" podID="8aa7ec69-393b-44ce-90f2-4efb3812bbb9" containerID="7623d0aae8d8cb10216d54652f0977734c3788f93c714a76b6fd6e1992fec8ab" exitCode=0 Nov 29 03:18:07 crc kubenswrapper[4749]: I1129 03:18:07.316864 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" event={"ID":"8aa7ec69-393b-44ce-90f2-4efb3812bbb9","Type":"ContainerDied","Data":"7623d0aae8d8cb10216d54652f0977734c3788f93c714a76b6fd6e1992fec8ab"} Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.850889 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.960730 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph\") pod \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.960842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key\") pod \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.961511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hclnv\" (UniqueName: \"kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv\") pod \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.961683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory\") pod \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\" (UID: \"8aa7ec69-393b-44ce-90f2-4efb3812bbb9\") " Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.966261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv" (OuterVolumeSpecName: "kube-api-access-hclnv") pod "8aa7ec69-393b-44ce-90f2-4efb3812bbb9" (UID: "8aa7ec69-393b-44ce-90f2-4efb3812bbb9"). InnerVolumeSpecName "kube-api-access-hclnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.966767 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph" (OuterVolumeSpecName: "ceph") pod "8aa7ec69-393b-44ce-90f2-4efb3812bbb9" (UID: "8aa7ec69-393b-44ce-90f2-4efb3812bbb9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.996869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory" (OuterVolumeSpecName: "inventory") pod "8aa7ec69-393b-44ce-90f2-4efb3812bbb9" (UID: "8aa7ec69-393b-44ce-90f2-4efb3812bbb9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:08 crc kubenswrapper[4749]: I1129 03:18:08.999028 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8aa7ec69-393b-44ce-90f2-4efb3812bbb9" (UID: "8aa7ec69-393b-44ce-90f2-4efb3812bbb9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.064839 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hclnv\" (UniqueName: \"kubernetes.io/projected/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-kube-api-access-hclnv\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.064878 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.064890 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.064903 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8aa7ec69-393b-44ce-90f2-4efb3812bbb9-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.346177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" event={"ID":"8aa7ec69-393b-44ce-90f2-4efb3812bbb9","Type":"ContainerDied","Data":"ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08"} Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.346299 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae83a5becc9fafbae4c61653c65a3d311c30ab5d501aebadbeb21960b3936a08" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.346234 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-gg88z" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.435802 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-8mjpf"] Nov 29 03:18:09 crc kubenswrapper[4749]: E1129 03:18:09.436430 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aa7ec69-393b-44ce-90f2-4efb3812bbb9" containerName="configure-os-openstack-openstack-cell1" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436452 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa7ec69-393b-44ce-90f2-4efb3812bbb9" containerName="configure-os-openstack-openstack-cell1" Nov 29 03:18:09 crc kubenswrapper[4749]: E1129 03:18:09.436469 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="extract-utilities" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436479 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="extract-utilities" Nov 29 03:18:09 crc kubenswrapper[4749]: E1129 03:18:09.436513 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="registry-server" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436522 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="registry-server" Nov 29 03:18:09 crc kubenswrapper[4749]: E1129 03:18:09.436551 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="extract-content" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436558 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="extract-content" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436825 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aa7ec69-393b-44ce-90f2-4efb3812bbb9" containerName="configure-os-openstack-openstack-cell1" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.436866 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5789fd1-cdc4-49a8-af97-6cc8c1351abc" containerName="registry-server" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.437856 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.443046 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.443416 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.443536 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.443697 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.446837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-8mjpf"] Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.577679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q6p6\" (UniqueName: \"kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.578004 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.578039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.578108 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.680161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q6p6\" (UniqueName: \"kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: 
\"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.680260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.680288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.680343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.686914 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.687760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.696291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.703332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q6p6\" (UniqueName: \"kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6\") pod \"ssh-known-hosts-openstack-8mjpf\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:09 crc kubenswrapper[4749]: I1129 03:18:09.757470 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:10 crc kubenswrapper[4749]: I1129 03:18:10.352964 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-8mjpf"] Nov 29 03:18:11 crc kubenswrapper[4749]: I1129 03:18:11.370100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8mjpf" event={"ID":"1ef348e7-3d36-45b4-90f8-582d82bc0d4a","Type":"ContainerStarted","Data":"0d32cb50935ff4b1b020d358851448862f3e4c2f5e700fca08e33969588504a3"} Nov 29 03:18:12 crc kubenswrapper[4749]: I1129 03:18:12.379573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8mjpf" event={"ID":"1ef348e7-3d36-45b4-90f8-582d82bc0d4a","Type":"ContainerStarted","Data":"64d909c76cc3966b78c4464703300d44fff7c6b42027608fabf08951bab39504"} Nov 29 03:18:12 crc kubenswrapper[4749]: I1129 03:18:12.414706 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-8mjpf" podStartSLOduration=2.149364849 podStartE2EDuration="3.41468744s" podCreationTimestamp="2025-11-29 03:18:09 +0000 UTC" firstStartedPulling="2025-11-29 03:18:10.358742167 +0000 UTC m=+7633.530892034" lastFinishedPulling="2025-11-29 03:18:11.624064728 +0000 UTC m=+7634.796214625" observedRunningTime="2025-11-29 03:18:12.413528732 +0000 UTC m=+7635.585678599" watchObservedRunningTime="2025-11-29 03:18:12.41468744 +0000 UTC m=+7635.586837307" Nov 29 03:18:21 crc kubenswrapper[4749]: I1129 03:18:21.498592 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ef348e7-3d36-45b4-90f8-582d82bc0d4a" containerID="64d909c76cc3966b78c4464703300d44fff7c6b42027608fabf08951bab39504" exitCode=0 Nov 29 03:18:21 crc kubenswrapper[4749]: I1129 03:18:21.498735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8mjpf" event={"ID":"1ef348e7-3d36-45b4-90f8-582d82bc0d4a","Type":"ContainerDied","Data":"64d909c76cc3966b78c4464703300d44fff7c6b42027608fabf08951bab39504"} Nov 29 03:18:22 crc kubenswrapper[4749]: I1129 03:18:22.960771 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.119719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0\") pod \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.120119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph\") pod \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.120244 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1\") pod \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.120507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q6p6\" (UniqueName: \"kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6\") pod \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\" (UID: \"1ef348e7-3d36-45b4-90f8-582d82bc0d4a\") " Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.129743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6" (OuterVolumeSpecName: "kube-api-access-8q6p6") pod "1ef348e7-3d36-45b4-90f8-582d82bc0d4a" (UID: "1ef348e7-3d36-45b4-90f8-582d82bc0d4a"). InnerVolumeSpecName "kube-api-access-8q6p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.130020 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph" (OuterVolumeSpecName: "ceph") pod "1ef348e7-3d36-45b4-90f8-582d82bc0d4a" (UID: "1ef348e7-3d36-45b4-90f8-582d82bc0d4a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.179069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "1ef348e7-3d36-45b4-90f8-582d82bc0d4a" (UID: "1ef348e7-3d36-45b4-90f8-582d82bc0d4a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.184397 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "1ef348e7-3d36-45b4-90f8-582d82bc0d4a" (UID: "1ef348e7-3d36-45b4-90f8-582d82bc0d4a"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.224053 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.224116 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.224151 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q6p6\" (UniqueName: \"kubernetes.io/projected/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-kube-api-access-8q6p6\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.224225 4749 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/1ef348e7-3d36-45b4-90f8-582d82bc0d4a-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.522456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-8mjpf" event={"ID":"1ef348e7-3d36-45b4-90f8-582d82bc0d4a","Type":"ContainerDied","Data":"0d32cb50935ff4b1b020d358851448862f3e4c2f5e700fca08e33969588504a3"} Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.522507 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-8mjpf" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.522512 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d32cb50935ff4b1b020d358851448862f3e4c2f5e700fca08e33969588504a3" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.645628 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf5c7"] Nov 29 03:18:23 crc kubenswrapper[4749]: E1129 03:18:23.646050 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef348e7-3d36-45b4-90f8-582d82bc0d4a" containerName="ssh-known-hosts-openstack" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.646068 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef348e7-3d36-45b4-90f8-582d82bc0d4a" containerName="ssh-known-hosts-openstack" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.646289 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef348e7-3d36-45b4-90f8-582d82bc0d4a" containerName="ssh-known-hosts-openstack" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.646986 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.649950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.650551 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.650864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.653040 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.676000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf5c7"] Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.734151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.734473 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.734509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.734530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n5pc\" (UniqueName: \"kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.836796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.836848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.836885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.836905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n5pc\" (UniqueName: \"kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.840841 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.841410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.841828 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.857071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n5pc\" (UniqueName: \"kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc\") pod \"run-os-openstack-openstack-cell1-wf5c7\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:23 crc kubenswrapper[4749]: I1129 03:18:23.965129 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:24 crc kubenswrapper[4749]: I1129 03:18:24.579889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wf5c7"] Nov 29 03:18:25 crc kubenswrapper[4749]: I1129 03:18:25.544633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" event={"ID":"0d96903e-8ce9-4112-abcf-0151817e99a8","Type":"ContainerStarted","Data":"259d0c64148bb2e0eb05440a446f46b135f91a7c1ab999221c617429363afecd"} Nov 29 03:18:25 crc kubenswrapper[4749]: I1129 03:18:25.545305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" event={"ID":"0d96903e-8ce9-4112-abcf-0151817e99a8","Type":"ContainerStarted","Data":"7e6fe575c82f7e041535b2400f2baa8d216f98b90d40c3cc8b5712c345fd3bbc"} Nov 29 03:18:25 crc kubenswrapper[4749]: I1129 03:18:25.562790 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" podStartSLOduration=1.970193617 podStartE2EDuration="2.562761629s" podCreationTimestamp="2025-11-29 03:18:23 +0000 UTC" firstStartedPulling="2025-11-29 03:18:24.590357353 +0000 UTC m=+7647.762507210" lastFinishedPulling="2025-11-29 03:18:25.182925355 +0000 UTC m=+7648.355075222" observedRunningTime="2025-11-29 03:18:25.56196167 +0000 UTC m=+7648.734111607" watchObservedRunningTime="2025-11-29 03:18:25.562761629 +0000 UTC m=+7648.734911556" Nov 29 03:18:33 crc kubenswrapper[4749]: I1129 03:18:33.661718 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d96903e-8ce9-4112-abcf-0151817e99a8" containerID="259d0c64148bb2e0eb05440a446f46b135f91a7c1ab999221c617429363afecd" exitCode=0 Nov 29 03:18:33 crc kubenswrapper[4749]: I1129 03:18:33.662337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" event={"ID":"0d96903e-8ce9-4112-abcf-0151817e99a8","Type":"ContainerDied","Data":"259d0c64148bb2e0eb05440a446f46b135f91a7c1ab999221c617429363afecd"} Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.127460 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.208517 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key\") pod \"0d96903e-8ce9-4112-abcf-0151817e99a8\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.208637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n5pc\" (UniqueName: \"kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc\") pod \"0d96903e-8ce9-4112-abcf-0151817e99a8\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.208698 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph\") pod \"0d96903e-8ce9-4112-abcf-0151817e99a8\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.208767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory\") pod \"0d96903e-8ce9-4112-abcf-0151817e99a8\" (UID: \"0d96903e-8ce9-4112-abcf-0151817e99a8\") " Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.219448 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph" (OuterVolumeSpecName: "ceph") pod "0d96903e-8ce9-4112-abcf-0151817e99a8" (UID: "0d96903e-8ce9-4112-abcf-0151817e99a8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.227430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc" (OuterVolumeSpecName: "kube-api-access-5n5pc") pod "0d96903e-8ce9-4112-abcf-0151817e99a8" (UID: "0d96903e-8ce9-4112-abcf-0151817e99a8"). InnerVolumeSpecName "kube-api-access-5n5pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.252578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory" (OuterVolumeSpecName: "inventory") pod "0d96903e-8ce9-4112-abcf-0151817e99a8" (UID: "0d96903e-8ce9-4112-abcf-0151817e99a8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.254338 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d96903e-8ce9-4112-abcf-0151817e99a8" (UID: "0d96903e-8ce9-4112-abcf-0151817e99a8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.315254 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.315282 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n5pc\" (UniqueName: \"kubernetes.io/projected/0d96903e-8ce9-4112-abcf-0151817e99a8-kube-api-access-5n5pc\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.315295 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.315303 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d96903e-8ce9-4112-abcf-0151817e99a8-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.689763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" event={"ID":"0d96903e-8ce9-4112-abcf-0151817e99a8","Type":"ContainerDied","Data":"7e6fe575c82f7e041535b2400f2baa8d216f98b90d40c3cc8b5712c345fd3bbc"} Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.689826 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e6fe575c82f7e041535b2400f2baa8d216f98b90d40c3cc8b5712c345fd3bbc" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.690404 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wf5c7" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.790396 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-s6bfv"] Nov 29 03:18:35 crc kubenswrapper[4749]: E1129 03:18:35.793487 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d96903e-8ce9-4112-abcf-0151817e99a8" containerName="run-os-openstack-openstack-cell1" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.793545 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d96903e-8ce9-4112-abcf-0151817e99a8" containerName="run-os-openstack-openstack-cell1" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.794010 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d96903e-8ce9-4112-abcf-0151817e99a8" containerName="run-os-openstack-openstack-cell1" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.795592 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.801279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.800033 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.803790 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.810542 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-s6bfv"] Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.813719 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.931538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.931778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.931855 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:35 crc kubenswrapper[4749]: I1129 03:18:35.932096 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jl2d\" (UniqueName: \"kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.034111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.034328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.034402 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.034532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jl2d\" (UniqueName: \"kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.040051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.040185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.050572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.055734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jl2d\" (UniqueName: \"kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d\") pod \"reboot-os-openstack-openstack-cell1-s6bfv\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.123934 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:36 crc kubenswrapper[4749]: I1129 03:18:36.803768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-s6bfv"] Nov 29 03:18:37 crc kubenswrapper[4749]: I1129 03:18:37.718279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" event={"ID":"59ea7fe7-a726-4e18-bd71-11070ae29d0a","Type":"ContainerStarted","Data":"6d79cdd8fc044ca31fb447f8826203d68e2b8a3526424c5c6ce5d2a5f4bf928f"} Nov 29 03:18:37 crc kubenswrapper[4749]: I1129 03:18:37.718947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" event={"ID":"59ea7fe7-a726-4e18-bd71-11070ae29d0a","Type":"ContainerStarted","Data":"9cb057cd4bb77efc0107f8ea325d8b53559007c7bd262aabcbe0e619e7c65e5b"} Nov 29 03:18:37 crc kubenswrapper[4749]: I1129 03:18:37.744874 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" podStartSLOduration=2.184359554 podStartE2EDuration="2.744856778s" podCreationTimestamp="2025-11-29 03:18:35 +0000 UTC" firstStartedPulling="2025-11-29 03:18:36.813737473 +0000 UTC m=+7659.985887340" lastFinishedPulling="2025-11-29 03:18:37.374234697 +0000 UTC m=+7660.546384564" observedRunningTime="2025-11-29 03:18:37.740753218 +0000 UTC m=+7660.912903115" watchObservedRunningTime="2025-11-29 03:18:37.744856778 +0000 UTC m=+7660.917006635" Nov 29 03:18:54 crc kubenswrapper[4749]: I1129 03:18:54.040665 4749 generic.go:334] "Generic (PLEG): container finished" podID="59ea7fe7-a726-4e18-bd71-11070ae29d0a" containerID="6d79cdd8fc044ca31fb447f8826203d68e2b8a3526424c5c6ce5d2a5f4bf928f" exitCode=0 Nov 29 03:18:54 crc kubenswrapper[4749]: I1129 03:18:54.040767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" event={"ID":"59ea7fe7-a726-4e18-bd71-11070ae29d0a","Type":"ContainerDied","Data":"6d79cdd8fc044ca31fb447f8826203d68e2b8a3526424c5c6ce5d2a5f4bf928f"} Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.610133 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.795483 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jl2d\" (UniqueName: \"kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d\") pod \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.795569 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph\") pod \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.795627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory\") pod \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.795654 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key\") pod \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\" (UID: \"59ea7fe7-a726-4e18-bd71-11070ae29d0a\") " Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.801208 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d" (OuterVolumeSpecName: "kube-api-access-2jl2d") pod "59ea7fe7-a726-4e18-bd71-11070ae29d0a" (UID: "59ea7fe7-a726-4e18-bd71-11070ae29d0a"). InnerVolumeSpecName "kube-api-access-2jl2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.802889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph" (OuterVolumeSpecName: "ceph") pod "59ea7fe7-a726-4e18-bd71-11070ae29d0a" (UID: "59ea7fe7-a726-4e18-bd71-11070ae29d0a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.845739 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59ea7fe7-a726-4e18-bd71-11070ae29d0a" (UID: "59ea7fe7-a726-4e18-bd71-11070ae29d0a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.846582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory" (OuterVolumeSpecName: "inventory") pod "59ea7fe7-a726-4e18-bd71-11070ae29d0a" (UID: "59ea7fe7-a726-4e18-bd71-11070ae29d0a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.898352 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jl2d\" (UniqueName: \"kubernetes.io/projected/59ea7fe7-a726-4e18-bd71-11070ae29d0a-kube-api-access-2jl2d\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.898387 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.898398 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:55 crc kubenswrapper[4749]: I1129 03:18:55.898407 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ea7fe7-a726-4e18-bd71-11070ae29d0a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.067949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" event={"ID":"59ea7fe7-a726-4e18-bd71-11070ae29d0a","Type":"ContainerDied","Data":"9cb057cd4bb77efc0107f8ea325d8b53559007c7bd262aabcbe0e619e7c65e5b"} Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.067994 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb057cd4bb77efc0107f8ea325d8b53559007c7bd262aabcbe0e619e7c65e5b" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.068022 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-s6bfv" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.174908 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bjfr2"] Nov 29 03:18:56 crc kubenswrapper[4749]: E1129 03:18:56.175499 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ea7fe7-a726-4e18-bd71-11070ae29d0a" containerName="reboot-os-openstack-openstack-cell1" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.175526 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ea7fe7-a726-4e18-bd71-11070ae29d0a" containerName="reboot-os-openstack-openstack-cell1" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.175793 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ea7fe7-a726-4e18-bd71-11070ae29d0a" containerName="reboot-os-openstack-openstack-cell1" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.176815 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.178681 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.180029 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.180043 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.180532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.204529 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bjfr2"] Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.307759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.307822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.307856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z54v\" (UniqueName: \"kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308156 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.308945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.309075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.410755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.410821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.410860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.410961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z54v\" (UniqueName: \"kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.411492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412376 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.412571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.416157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.416667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.417061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.417895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.418106 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.419295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: 
\"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.419954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.420686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.420922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.421248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.421274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.430974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z54v\" (UniqueName: \"kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v\") pod \"install-certs-openstack-openstack-cell1-bjfr2\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:56 crc kubenswrapper[4749]: I1129 03:18:56.504700 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:18:57 crc kubenswrapper[4749]: W1129 03:18:57.170665 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod443e1ef6_779b_44ec_9f24_6a661a47a0a6.slice/crio-620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4 WatchSource:0}: Error finding container 620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4: Status 404 returned error can't find the container with id 620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4 Nov 29 03:18:57 crc kubenswrapper[4749]: I1129 03:18:57.172230 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-bjfr2"] Nov 29 03:18:57 crc kubenswrapper[4749]: I1129 03:18:57.726568 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:18:58 crc kubenswrapper[4749]: I1129 03:18:58.112359 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" event={"ID":"443e1ef6-779b-44ec-9f24-6a661a47a0a6","Type":"ContainerStarted","Data":"3f1112df385f1a43ec2b935c521f3ed5741c9010f4dbcb96f5a82371309028cb"} Nov 29 03:18:58 crc kubenswrapper[4749]: I1129 03:18:58.112402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" event={"ID":"443e1ef6-779b-44ec-9f24-6a661a47a0a6","Type":"ContainerStarted","Data":"620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4"} Nov 29 03:18:58 crc kubenswrapper[4749]: I1129 03:18:58.135837 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" podStartSLOduration=1.5867502999999998 podStartE2EDuration="2.135817765s" podCreationTimestamp="2025-11-29 03:18:56 +0000 UTC" firstStartedPulling="2025-11-29 03:18:57.173277888 +0000 UTC m=+7680.345427745" lastFinishedPulling="2025-11-29 03:18:57.722345343 +0000 UTC m=+7680.894495210" observedRunningTime="2025-11-29 03:18:58.134275608 +0000 UTC m=+7681.306425525" watchObservedRunningTime="2025-11-29 03:18:58.135817765 +0000 UTC m=+7681.307967622" Nov 29 03:19:08 crc kubenswrapper[4749]: I1129 03:19:08.868635 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:08 crc kubenswrapper[4749]: I1129 03:19:08.874260 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:08 crc kubenswrapper[4749]: I1129 03:19:08.886403 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.061184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.061287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8m5\" (UniqueName: \"kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.061646 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.164407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.164545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.164597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8m5\" (UniqueName: \"kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.165613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.165623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.197187 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bd8m5\" (UniqueName: \"kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5\") pod \"redhat-marketplace-qpwh8\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.217814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:09 crc kubenswrapper[4749]: I1129 03:19:09.712395 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:10 crc kubenswrapper[4749]: I1129 03:19:10.285974 4749 generic.go:334] "Generic (PLEG): container finished" podID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerID="648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef" exitCode=0 Nov 29 03:19:10 crc kubenswrapper[4749]: I1129 03:19:10.286087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerDied","Data":"648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef"} Nov 29 03:19:10 crc kubenswrapper[4749]: I1129 03:19:10.286385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerStarted","Data":"f325fc57dc416a549e0a0798bf1338f12b458c059f7befc06bd413a6332f895b"} Nov 29 03:19:12 crc kubenswrapper[4749]: I1129 03:19:12.321566 4749 generic.go:334] "Generic (PLEG): container finished" podID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerID="7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d" exitCode=0 Nov 29 03:19:12 crc kubenswrapper[4749]: I1129 03:19:12.321664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerDied","Data":"7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d"} Nov 29 03:19:13 crc kubenswrapper[4749]: I1129 03:19:13.341691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerStarted","Data":"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a"} Nov 29 03:19:13 crc kubenswrapper[4749]: I1129 03:19:13.369992 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qpwh8" podStartSLOduration=2.834995623 podStartE2EDuration="5.36997565s" podCreationTimestamp="2025-11-29 03:19:08 +0000 UTC" firstStartedPulling="2025-11-29 03:19:10.289870753 +0000 UTC m=+7693.462020640" lastFinishedPulling="2025-11-29 03:19:12.82485081 +0000 UTC m=+7695.997000667" observedRunningTime="2025-11-29 03:19:13.363176045 +0000 UTC m=+7696.535325932" watchObservedRunningTime="2025-11-29 03:19:13.36997565 +0000 UTC m=+7696.542125517" Nov 29 03:19:18 crc kubenswrapper[4749]: I1129 03:19:18.408005 4749 generic.go:334] "Generic (PLEG): container finished" podID="443e1ef6-779b-44ec-9f24-6a661a47a0a6" containerID="3f1112df385f1a43ec2b935c521f3ed5741c9010f4dbcb96f5a82371309028cb" exitCode=0 Nov 29 03:19:18 crc kubenswrapper[4749]: I1129 03:19:18.408149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" 
event={"ID":"443e1ef6-779b-44ec-9f24-6a661a47a0a6","Type":"ContainerDied","Data":"3f1112df385f1a43ec2b935c521f3ed5741c9010f4dbcb96f5a82371309028cb"} Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.218786 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.219132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.291719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.504082 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.560220 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:19 crc kubenswrapper[4749]: I1129 03:19:19.887559 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z54v\" (UniqueName: \"kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022614 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022905 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022956 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.022997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.023076 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.023158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.023220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory\") pod \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\" (UID: \"443e1ef6-779b-44ec-9f24-6a661a47a0a6\") " Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.030799 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.031625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.031717 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.031805 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v" (OuterVolumeSpecName: "kube-api-access-8z54v") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "kube-api-access-8z54v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.032390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph" (OuterVolumeSpecName: "ceph") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.032615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.033769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.033784 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.034770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.035982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.060004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory" (OuterVolumeSpecName: "inventory") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.066768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "443e1ef6-779b-44ec-9f24-6a661a47a0a6" (UID: "443e1ef6-779b-44ec-9f24-6a661a47a0a6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126245 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126283 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126299 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126314 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126329 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126340 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126350 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126364 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126374 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z54v\" (UniqueName: \"kubernetes.io/projected/443e1ef6-779b-44ec-9f24-6a661a47a0a6-kube-api-access-8z54v\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126384 4749 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126394 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.126416 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/443e1ef6-779b-44ec-9f24-6a661a47a0a6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.433500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" event={"ID":"443e1ef6-779b-44ec-9f24-6a661a47a0a6","Type":"ContainerDied","Data":"620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4"} Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.433919 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620d7728b5181cf7f5d150c07679585f3067395e9229f26471efa18cc6251aa4" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.433616 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-bjfr2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.540499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kbbt2"] Nov 29 03:19:20 crc kubenswrapper[4749]: E1129 03:19:20.541031 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443e1ef6-779b-44ec-9f24-6a661a47a0a6" containerName="install-certs-openstack-openstack-cell1" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.541047 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="443e1ef6-779b-44ec-9f24-6a661a47a0a6" containerName="install-certs-openstack-openstack-cell1" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.541357 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="443e1ef6-779b-44ec-9f24-6a661a47a0a6" containerName="install-certs-openstack-openstack-cell1" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.542296 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.544134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.544299 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.544556 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.548339 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.566955 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kbbt2"] Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.639468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.639602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.639626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h49d4\" (UniqueName: \"kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.639666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.741724 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.742080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h49d4\" (UniqueName: \"kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 
03:19:20.742250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.742438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.747117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.747951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.748622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.759484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h49d4\" (UniqueName: \"kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4\") pod \"ceph-client-openstack-openstack-cell1-kbbt2\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:20 crc kubenswrapper[4749]: I1129 03:19:20.916658 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:21 crc kubenswrapper[4749]: I1129 03:19:21.444725 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qpwh8" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="registry-server" containerID="cri-o://471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a" gracePeriod=2 Nov 29 03:19:21 crc kubenswrapper[4749]: I1129 03:19:21.536576 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-kbbt2"] Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.064366 4749 util.go:48] "No ready sandbox for pod can be found. 
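Note: "Killing container with a grace period" is the standard termination contract: the runtime delivers SIGTERM, waits up to gracePeriod seconds, then falls back to SIGKILL. A standalone Unix sketch of that contract using plain os/exec (not the CRI-O implementation):

// Sketch of the "kill with grace period" contract seen above: SIGTERM first,
// SIGKILL after the deadline. Plain os/exec on Unix, not the CRI implementation.
package main

import (
	"os/exec"
	"syscall"
	"time"
)

func killWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request to exit

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		_ = cmd.Process.Kill() // grace expired: SIGKILL
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	_ = cmd.Start()
	_ = killWithGrace(cmd, 2*time.Second) // gracePeriod=2, as in the log above
}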
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.200148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities\") pod \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.200387 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content\") pod \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.200526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd8m5\" (UniqueName: \"kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5\") pod \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\" (UID: \"0caca417-1cff-45bf-98a9-7d6dfd6c5331\") " Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.202365 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities" (OuterVolumeSpecName: "utilities") pod "0caca417-1cff-45bf-98a9-7d6dfd6c5331" (UID: "0caca417-1cff-45bf-98a9-7d6dfd6c5331"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.215455 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5" (OuterVolumeSpecName: "kube-api-access-bd8m5") pod "0caca417-1cff-45bf-98a9-7d6dfd6c5331" (UID: "0caca417-1cff-45bf-98a9-7d6dfd6c5331"). InnerVolumeSpecName "kube-api-access-bd8m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.222179 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0caca417-1cff-45bf-98a9-7d6dfd6c5331" (UID: "0caca417-1cff-45bf-98a9-7d6dfd6c5331"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.304634 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.304677 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0caca417-1cff-45bf-98a9-7d6dfd6c5331-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.304691 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd8m5\" (UniqueName: \"kubernetes.io/projected/0caca417-1cff-45bf-98a9-7d6dfd6c5331-kube-api-access-bd8m5\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.464838 4749 generic.go:334] "Generic (PLEG): container finished" podID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerID="471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a" exitCode=0 Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.464899 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerDied","Data":"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a"} Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.464928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qpwh8" event={"ID":"0caca417-1cff-45bf-98a9-7d6dfd6c5331","Type":"ContainerDied","Data":"f325fc57dc416a549e0a0798bf1338f12b458c059f7befc06bd413a6332f895b"} Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.464946 4749 scope.go:117] "RemoveContainer" containerID="471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.464936 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qpwh8" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.468138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" event={"ID":"28ba02e8-574d-4047-8c56-edbcea634220","Type":"ContainerStarted","Data":"6c365d0728ff53dcba449db8292f778de2e4405b534dffb7e67bf5d066eb9667"} Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.513259 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.513363 4749 scope.go:117] "RemoveContainer" containerID="7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.523661 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qpwh8"] Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.536665 4749 scope.go:117] "RemoveContainer" containerID="648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.558302 4749 scope.go:117] "RemoveContainer" containerID="471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a" Nov 29 03:19:22 crc kubenswrapper[4749]: E1129 03:19:22.560423 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a\": container with ID starting with 471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a not found: ID does not exist" containerID="471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.560473 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a"} err="failed to get container status \"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a\": rpc error: code = NotFound desc = could not find container \"471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a\": container with ID starting with 471f73e34a37bc92a66112bb11ffbafb2a9a0d77016c52a5f001c1d62449164a not found: ID does not exist" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.560506 4749 scope.go:117] "RemoveContainer" containerID="7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d" Nov 29 03:19:22 crc kubenswrapper[4749]: E1129 03:19:22.560921 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d\": container with ID starting with 7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d not found: ID does not exist" containerID="7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.560961 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d"} err="failed to get container status \"7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d\": rpc error: code = NotFound desc = could not find container \"7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d\": container with ID starting with 7477317ebe6dd577b29ceed3b53d2c103d7d6fe3efd4c1e47cedc4d628658f1d 
not found: ID does not exist" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.560989 4749 scope.go:117] "RemoveContainer" containerID="648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef" Nov 29 03:19:22 crc kubenswrapper[4749]: E1129 03:19:22.561399 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef\": container with ID starting with 648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef not found: ID does not exist" containerID="648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef" Nov 29 03:19:22 crc kubenswrapper[4749]: I1129 03:19:22.561427 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef"} err="failed to get container status \"648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef\": rpc error: code = NotFound desc = could not find container \"648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef\": container with ID starting with 648e0c7dbca7bbc45fdf19dd13e82b8fb62e7e36e88875cfaaee00da285471ef not found: ID does not exist" Nov 29 03:19:23 crc kubenswrapper[4749]: I1129 03:19:23.093622 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" path="/var/lib/kubelet/pods/0caca417-1cff-45bf-98a9-7d6dfd6c5331/volumes" Nov 29 03:19:23 crc kubenswrapper[4749]: I1129 03:19:23.483122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" event={"ID":"28ba02e8-574d-4047-8c56-edbcea634220","Type":"ContainerStarted","Data":"bd1f13d4a3c2317289e93a87fac488264c799faa739ac8418fa6b8b28e68daa5"} Nov 29 03:19:23 crc kubenswrapper[4749]: I1129 03:19:23.515741 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" podStartSLOduration=2.843447654 podStartE2EDuration="3.51566237s" podCreationTimestamp="2025-11-29 03:19:20 +0000 UTC" firstStartedPulling="2025-11-29 03:19:21.572294842 +0000 UTC m=+7704.744444699" lastFinishedPulling="2025-11-29 03:19:22.244509558 +0000 UTC m=+7705.416659415" observedRunningTime="2025-11-29 03:19:23.511489359 +0000 UTC m=+7706.683639236" watchObservedRunningTime="2025-11-29 03:19:23.51566237 +0000 UTC m=+7706.687812257" Nov 29 03:19:25 crc kubenswrapper[4749]: I1129 03:19:25.374706 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:19:25 crc kubenswrapper[4749]: I1129 03:19:25.375382 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:19:28 crc kubenswrapper[4749]: I1129 03:19:28.556906 4749 generic.go:334] "Generic (PLEG): container finished" podID="28ba02e8-574d-4047-8c56-edbcea634220" containerID="bd1f13d4a3c2317289e93a87fac488264c799faa739ac8418fa6b8b28e68daa5" exitCode=0 Nov 29 03:19:28 crc kubenswrapper[4749]: I1129 03:19:28.558826 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" event={"ID":"28ba02e8-574d-4047-8c56-edbcea634220","Type":"ContainerDied","Data":"bd1f13d4a3c2317289e93a87fac488264c799faa739ac8418fa6b8b28e68daa5"} Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.014172 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.087702 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key\") pod \"28ba02e8-574d-4047-8c56-edbcea634220\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.087795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h49d4\" (UniqueName: \"kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4\") pod \"28ba02e8-574d-4047-8c56-edbcea634220\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.087865 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory\") pod \"28ba02e8-574d-4047-8c56-edbcea634220\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.087945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph\") pod \"28ba02e8-574d-4047-8c56-edbcea634220\" (UID: \"28ba02e8-574d-4047-8c56-edbcea634220\") " Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.093267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4" (OuterVolumeSpecName: "kube-api-access-h49d4") pod "28ba02e8-574d-4047-8c56-edbcea634220" (UID: "28ba02e8-574d-4047-8c56-edbcea634220"). InnerVolumeSpecName "kube-api-access-h49d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.097278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph" (OuterVolumeSpecName: "ceph") pod "28ba02e8-574d-4047-8c56-edbcea634220" (UID: "28ba02e8-574d-4047-8c56-edbcea634220"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.116585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory" (OuterVolumeSpecName: "inventory") pod "28ba02e8-574d-4047-8c56-edbcea634220" (UID: "28ba02e8-574d-4047-8c56-edbcea634220"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.124743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "28ba02e8-574d-4047-8c56-edbcea634220" (UID: "28ba02e8-574d-4047-8c56-edbcea634220"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.190941 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.191732 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h49d4\" (UniqueName: \"kubernetes.io/projected/28ba02e8-574d-4047-8c56-edbcea634220-kube-api-access-h49d4\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.191745 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.191772 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28ba02e8-574d-4047-8c56-edbcea634220-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.590579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" event={"ID":"28ba02e8-574d-4047-8c56-edbcea634220","Type":"ContainerDied","Data":"6c365d0728ff53dcba449db8292f778de2e4405b534dffb7e67bf5d066eb9667"} Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.590679 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c365d0728ff53dcba449db8292f778de2e4405b534dffb7e67bf5d066eb9667" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.590724 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-kbbt2" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.683559 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dlzxk"] Nov 29 03:19:30 crc kubenswrapper[4749]: E1129 03:19:30.688839 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="extract-content" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.688886 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="extract-content" Nov 29 03:19:30 crc kubenswrapper[4749]: E1129 03:19:30.688911 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="extract-utilities" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.688920 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="extract-utilities" Nov 29 03:19:30 crc kubenswrapper[4749]: E1129 03:19:30.688934 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="registry-server" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.688942 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="registry-server" Nov 29 03:19:30 crc kubenswrapper[4749]: E1129 03:19:30.688966 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ba02e8-574d-4047-8c56-edbcea634220" containerName="ceph-client-openstack-openstack-cell1" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.688975 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28ba02e8-574d-4047-8c56-edbcea634220" containerName="ceph-client-openstack-openstack-cell1" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.689327 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0caca417-1cff-45bf-98a9-7d6dfd6c5331" containerName="registry-server" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.689344 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ba02e8-574d-4047-8c56-edbcea634220" containerName="ceph-client-openstack-openstack-cell1" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.690398 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.693970 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.694306 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.694397 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.694887 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.695120 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.695760 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dlzxk"] Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807239 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807374 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.807430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvc9\" (UniqueName: \"kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.909665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.909849 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.909963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.910017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.910080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvc9\" (UniqueName: \"kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.910191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.910863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " 
pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.916499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.916540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.917038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.917681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:30 crc kubenswrapper[4749]: I1129 03:19:30.933776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvc9\" (UniqueName: \"kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9\") pod \"ovn-openstack-openstack-cell1-dlzxk\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:31 crc kubenswrapper[4749]: I1129 03:19:31.053125 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:19:31 crc kubenswrapper[4749]: I1129 03:19:31.676102 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-dlzxk"] Nov 29 03:19:31 crc kubenswrapper[4749]: W1129 03:19:31.677671 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod033a5a84_a10e_4a5f_a7f5_de6f348f5b32.slice/crio-35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb WatchSource:0}: Error finding container 35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb: Status 404 returned error can't find the container with id 35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb Nov 29 03:19:32 crc kubenswrapper[4749]: I1129 03:19:32.616509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" event={"ID":"033a5a84-a10e-4a5f-a7f5-de6f348f5b32","Type":"ContainerStarted","Data":"457ed7ffde59cddc1aa650700ece819d7dd963936cc481bbfd2c0874e42a6749"} Nov 29 03:19:32 crc kubenswrapper[4749]: I1129 03:19:32.616555 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" event={"ID":"033a5a84-a10e-4a5f-a7f5-de6f348f5b32","Type":"ContainerStarted","Data":"35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb"} Nov 29 03:19:32 crc kubenswrapper[4749]: I1129 03:19:32.652824 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" podStartSLOduration=2.194883384 podStartE2EDuration="2.652803126s" podCreationTimestamp="2025-11-29 03:19:30 +0000 UTC" firstStartedPulling="2025-11-29 03:19:31.680687416 +0000 UTC m=+7714.852837273" lastFinishedPulling="2025-11-29 03:19:32.138607158 +0000 UTC m=+7715.310757015" observedRunningTime="2025-11-29 03:19:32.642179208 +0000 UTC m=+7715.814329095" watchObservedRunningTime="2025-11-29 03:19:32.652803126 +0000 UTC m=+7715.824953023" Nov 29 03:19:55 crc kubenswrapper[4749]: I1129 03:19:55.373961 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:19:55 crc kubenswrapper[4749]: I1129 03:19:55.374488 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:20:25 crc kubenswrapper[4749]: I1129 03:20:25.373962 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:20:25 crc kubenswrapper[4749]: I1129 03:20:25.374763 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 29 03:20:25 crc kubenswrapper[4749]: I1129 03:20:25.374826 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:20:25 crc kubenswrapper[4749]: I1129 03:20:25.376245 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:20:25 crc kubenswrapper[4749]: I1129 03:20:25.376370 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" gracePeriod=600 Nov 29 03:20:25 crc kubenswrapper[4749]: E1129 03:20:25.508279 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:20:26 crc kubenswrapper[4749]: I1129 03:20:26.259327 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" exitCode=0 Nov 29 03:20:26 crc kubenswrapper[4749]: I1129 03:20:26.259390 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0"} Nov 29 03:20:26 crc kubenswrapper[4749]: I1129 03:20:26.259698 4749 scope.go:117] "RemoveContainer" containerID="baca58002eea9c4ed127f05274b93052cce5e573c579199bacc6fbe8d465ca4c" Nov 29 03:20:26 crc kubenswrapper[4749]: I1129 03:20:26.261406 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:20:26 crc kubenswrapper[4749]: E1129 03:20:26.261993 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:20:37 crc kubenswrapper[4749]: I1129 03:20:37.081334 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:20:37 crc kubenswrapper[4749]: E1129 03:20:37.082269 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:20:41 crc kubenswrapper[4749]: I1129 03:20:41.462175 4749 generic.go:334] "Generic (PLEG): container finished" podID="033a5a84-a10e-4a5f-a7f5-de6f348f5b32" containerID="457ed7ffde59cddc1aa650700ece819d7dd963936cc481bbfd2c0874e42a6749" exitCode=0 Nov 29 03:20:41 crc kubenswrapper[4749]: I1129 03:20:41.462292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" event={"ID":"033a5a84-a10e-4a5f-a7f5-de6f348f5b32","Type":"ContainerDied","Data":"457ed7ffde59cddc1aa650700ece819d7dd963936cc481bbfd2c0874e42a6749"} Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.038558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.054565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.054673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvc9\" (UniqueName: \"kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.054888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.055077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.055257 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.055329 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph\") pod \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\" (UID: \"033a5a84-a10e-4a5f-a7f5-de6f348f5b32\") " Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.064787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9" (OuterVolumeSpecName: "kube-api-access-ddvc9") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "kube-api-access-ddvc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.067732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph" (OuterVolumeSpecName: "ceph") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.071881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.088776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.092366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory" (OuterVolumeSpecName: "inventory") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.106052 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "033a5a84-a10e-4a5f-a7f5-de6f348f5b32" (UID: "033a5a84-a10e-4a5f-a7f5-de6f348f5b32"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158276 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158304 4749 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158317 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158328 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158339 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvc9\" (UniqueName: \"kubernetes.io/projected/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-kube-api-access-ddvc9\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.158350 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/033a5a84-a10e-4a5f-a7f5-de6f348f5b32-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.501601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" event={"ID":"033a5a84-a10e-4a5f-a7f5-de6f348f5b32","Type":"ContainerDied","Data":"35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb"} Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.501903 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d2f8d5438680b2f82e0c741839ec2b65488930c3aeedf8573b98a2cfd34ddb" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.501776 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-dlzxk" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.660830 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c9ztz"] Nov 29 03:20:43 crc kubenswrapper[4749]: E1129 03:20:43.661451 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033a5a84-a10e-4a5f-a7f5-de6f348f5b32" containerName="ovn-openstack-openstack-cell1" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.661478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="033a5a84-a10e-4a5f-a7f5-de6f348f5b32" containerName="ovn-openstack-openstack-cell1" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.661746 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="033a5a84-a10e-4a5f-a7f5-de6f348f5b32" containerName="ovn-openstack-openstack-cell1" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.662661 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.668787 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.669071 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.669370 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.669502 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.669578 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.669542 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.670437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.670507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.670615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.670663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.674925 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.675155 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4bt\" (UniqueName: \"kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.675347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.681631 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c9ztz"] Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777545 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4bt\" (UniqueName: \"kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.777944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.783159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.783831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.784088 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.784112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.784713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.784831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.797603 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4bt\" (UniqueName: \"kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt\") pod \"neutron-metadata-openstack-openstack-cell1-c9ztz\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:43 crc kubenswrapper[4749]: I1129 03:20:43.997364 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:20:44 crc kubenswrapper[4749]: W1129 03:20:44.565867 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d40b0dd_b6a5_482b_87f9_d2780c14f322.slice/crio-fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513 WatchSource:0}: Error finding container fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513: Status 404 returned error can't find the container with id fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513 Nov 29 03:20:44 crc kubenswrapper[4749]: I1129 03:20:44.568312 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-c9ztz"] Nov 29 03:20:45 crc kubenswrapper[4749]: I1129 03:20:45.555645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" event={"ID":"0d40b0dd-b6a5-482b-87f9-d2780c14f322","Type":"ContainerStarted","Data":"d5597ae1244bea13636d62c900b8bfb0fd706f56a8fd073d3980899f55051e09"} Nov 29 03:20:45 crc kubenswrapper[4749]: I1129 03:20:45.556300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" event={"ID":"0d40b0dd-b6a5-482b-87f9-d2780c14f322","Type":"ContainerStarted","Data":"fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513"} Nov 29 03:20:45 crc kubenswrapper[4749]: I1129 03:20:45.588890 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" podStartSLOduration=2.079529607 podStartE2EDuration="2.588864586s" podCreationTimestamp="2025-11-29 03:20:43 +0000 UTC" firstStartedPulling="2025-11-29 03:20:44.570898324 +0000 UTC m=+7787.743048181" lastFinishedPulling="2025-11-29 03:20:45.080233273 +0000 UTC m=+7788.252383160" observedRunningTime="2025-11-29 03:20:45.578861324 +0000 UTC m=+7788.751011241" watchObservedRunningTime="2025-11-29 03:20:45.588864586 +0000 UTC m=+7788.761014483" Nov 29 03:20:52 crc kubenswrapper[4749]: I1129 03:20:52.076493 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:20:52 crc kubenswrapper[4749]: E1129 03:20:52.077343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:21:03 crc kubenswrapper[4749]: I1129 03:21:03.074951 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:21:03 crc kubenswrapper[4749]: E1129 03:21:03.075922 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:21:16 crc kubenswrapper[4749]: I1129 03:21:16.075753 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:21:16 crc kubenswrapper[4749]: E1129 03:21:16.077082 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:21:31 crc kubenswrapper[4749]: I1129 03:21:31.076558 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:21:31 crc kubenswrapper[4749]: E1129 03:21:31.077529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:21:40 crc kubenswrapper[4749]: I1129 03:21:40.278576 4749 generic.go:334] "Generic (PLEG): container finished" podID="0d40b0dd-b6a5-482b-87f9-d2780c14f322" containerID="d5597ae1244bea13636d62c900b8bfb0fd706f56a8fd073d3980899f55051e09" exitCode=0 Nov 29 03:21:40 crc kubenswrapper[4749]: I1129 03:21:40.278905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" event={"ID":"0d40b0dd-b6a5-482b-87f9-d2780c14f322","Type":"ContainerDied","Data":"d5597ae1244bea13636d62c900b8bfb0fd706f56a8fd073d3980899f55051e09"} Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.310713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" event={"ID":"0d40b0dd-b6a5-482b-87f9-d2780c14f322","Type":"ContainerDied","Data":"fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513"} Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.311356 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd762298ee2cd78dd77679a7f84b3defed602c79334227bad6fee21e8b4a513" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.379636 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.493570 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.493839 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.493975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.494137 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.494264 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.494412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4bt\" (UniqueName: \"kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.494957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle\") pod \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\" (UID: \"0d40b0dd-b6a5-482b-87f9-d2780c14f322\") " Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.501326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt" (OuterVolumeSpecName: "kube-api-access-6g4bt") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "kube-api-access-6g4bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.519459 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph" (OuterVolumeSpecName: "ceph") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.521451 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.529564 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.537225 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.545681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory" (OuterVolumeSpecName: "inventory") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.554255 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "0d40b0dd-b6a5-482b-87f9-d2780c14f322" (UID: "0d40b0dd-b6a5-482b-87f9-d2780c14f322"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.598475 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.598753 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.598862 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.599001 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.599101 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.599192 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g4bt\" (UniqueName: \"kubernetes.io/projected/0d40b0dd-b6a5-482b-87f9-d2780c14f322-kube-api-access-6g4bt\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:42 crc kubenswrapper[4749]: I1129 03:21:42.599313 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d40b0dd-b6a5-482b-87f9-d2780c14f322-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.333750 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-c9ztz" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.562417 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ms5ld"] Nov 29 03:21:43 crc kubenswrapper[4749]: E1129 03:21:43.563035 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d40b0dd-b6a5-482b-87f9-d2780c14f322" containerName="neutron-metadata-openstack-openstack-cell1" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.563095 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d40b0dd-b6a5-482b-87f9-d2780c14f322" containerName="neutron-metadata-openstack-openstack-cell1" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.563352 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d40b0dd-b6a5-482b-87f9-d2780c14f322" containerName="neutron-metadata-openstack-openstack-cell1" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.564101 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.567992 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.568292 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.568610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.568785 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.569631 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.594463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ms5ld"] Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.624588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.624883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.625058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.625258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7jt5\" (UniqueName: \"kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.625386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.625572 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph\") pod 
\"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.727634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.727974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7jt5\" (UniqueName: \"kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.728003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.728084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.728159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.728183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.735934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.735959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.736431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.737742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.740604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.745772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7jt5\" (UniqueName: \"kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5\") pod \"libvirt-openstack-openstack-cell1-ms5ld\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:43 crc kubenswrapper[4749]: I1129 03:21:43.943237 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:21:44 crc kubenswrapper[4749]: I1129 03:21:44.543055 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-ms5ld"] Nov 29 03:21:45 crc kubenswrapper[4749]: I1129 03:21:45.075074 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:21:45 crc kubenswrapper[4749]: E1129 03:21:45.075607 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:21:45 crc kubenswrapper[4749]: I1129 03:21:45.355827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" event={"ID":"402e3628-3422-4ebd-b3aa-aa8b36553f92","Type":"ContainerStarted","Data":"6739e66121a20f0f574a517267ae8049d323cd0260849d3f82985564b02b43ae"} Nov 29 03:21:46 crc kubenswrapper[4749]: I1129 03:21:46.369145 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" event={"ID":"402e3628-3422-4ebd-b3aa-aa8b36553f92","Type":"ContainerStarted","Data":"fcff08365669b0cfd90699a66fb1e870980c88255b9695c9c4f00ff964045a48"} Nov 29 03:21:46 crc kubenswrapper[4749]: I1129 03:21:46.395636 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" podStartSLOduration=2.790608255 podStartE2EDuration="3.395610899s" podCreationTimestamp="2025-11-29 03:21:43 +0000 UTC" firstStartedPulling="2025-11-29 03:21:44.553064119 +0000 UTC m=+7847.725213986" 
lastFinishedPulling="2025-11-29 03:21:45.158066773 +0000 UTC m=+7848.330216630" observedRunningTime="2025-11-29 03:21:46.394457351 +0000 UTC m=+7849.566607228" watchObservedRunningTime="2025-11-29 03:21:46.395610899 +0000 UTC m=+7849.567760766" Nov 29 03:21:56 crc kubenswrapper[4749]: I1129 03:21:56.076449 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:21:56 crc kubenswrapper[4749]: E1129 03:21:56.077050 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:22:07 crc kubenswrapper[4749]: I1129 03:22:07.085397 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:22:07 crc kubenswrapper[4749]: E1129 03:22:07.086283 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:22:18 crc kubenswrapper[4749]: I1129 03:22:18.075377 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:22:18 crc kubenswrapper[4749]: E1129 03:22:18.076257 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:22:29 crc kubenswrapper[4749]: I1129 03:22:29.075320 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:22:29 crc kubenswrapper[4749]: E1129 03:22:29.076234 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:22:42 crc kubenswrapper[4749]: I1129 03:22:42.075615 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:22:42 crc kubenswrapper[4749]: E1129 03:22:42.078237 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:22:53 crc kubenswrapper[4749]: I1129 03:22:53.076288 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:22:53 crc kubenswrapper[4749]: E1129 03:22:53.077164 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:04 crc kubenswrapper[4749]: I1129 03:23:04.075889 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:23:04 crc kubenswrapper[4749]: E1129 03:23:04.077023 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:15 crc kubenswrapper[4749]: I1129 03:23:15.075382 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:23:15 crc kubenswrapper[4749]: E1129 03:23:15.076593 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:30 crc kubenswrapper[4749]: I1129 03:23:30.076472 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:23:30 crc kubenswrapper[4749]: E1129 03:23:30.077962 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.005105 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.010029 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.016860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.071324 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.071393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.071817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jmv\" (UniqueName: \"kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.075260 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:23:42 crc kubenswrapper[4749]: E1129 03:23:42.075583 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.174329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jmv\" (UniqueName: \"kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.174450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.174500 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.175002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.175587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.194371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jmv\" (UniqueName: \"kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv\") pod \"community-operators-qvk7s\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.345590 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:42 crc kubenswrapper[4749]: I1129 03:23:42.920929 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:42 crc kubenswrapper[4749]: W1129 03:23:42.936887 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d501aa_874b_4b4b_8597_610530a577ca.slice/crio-ca565b7e2a29131c30ec343a982483ccae1f02921ae2c845bd5370ec12a17beb WatchSource:0}: Error finding container ca565b7e2a29131c30ec343a982483ccae1f02921ae2c845bd5370ec12a17beb: Status 404 returned error can't find the container with id ca565b7e2a29131c30ec343a982483ccae1f02921ae2c845bd5370ec12a17beb Nov 29 03:23:43 crc kubenswrapper[4749]: I1129 03:23:43.852863 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9d501aa-874b-4b4b-8597-610530a577ca" containerID="e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8" exitCode=0 Nov 29 03:23:43 crc kubenswrapper[4749]: I1129 03:23:43.852937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerDied","Data":"e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8"} Nov 29 03:23:43 crc kubenswrapper[4749]: I1129 03:23:43.853656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerStarted","Data":"ca565b7e2a29131c30ec343a982483ccae1f02921ae2c845bd5370ec12a17beb"} Nov 29 03:23:43 crc kubenswrapper[4749]: I1129 03:23:43.858449 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:23:44 crc kubenswrapper[4749]: I1129 03:23:44.865593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerStarted","Data":"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71"} Nov 29 03:23:45 crc kubenswrapper[4749]: I1129 03:23:45.878266 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9d501aa-874b-4b4b-8597-610530a577ca" containerID="6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71" exitCode=0 
Nov 29 03:23:45 crc kubenswrapper[4749]: I1129 03:23:45.878340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerDied","Data":"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71"} Nov 29 03:23:46 crc kubenswrapper[4749]: I1129 03:23:46.892753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerStarted","Data":"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e"} Nov 29 03:23:46 crc kubenswrapper[4749]: I1129 03:23:46.912238 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qvk7s" podStartSLOduration=3.49552042 podStartE2EDuration="5.912218635s" podCreationTimestamp="2025-11-29 03:23:41 +0000 UTC" firstStartedPulling="2025-11-29 03:23:43.857998546 +0000 UTC m=+7967.030148443" lastFinishedPulling="2025-11-29 03:23:46.274696761 +0000 UTC m=+7969.446846658" observedRunningTime="2025-11-29 03:23:46.91081288 +0000 UTC m=+7970.082962757" watchObservedRunningTime="2025-11-29 03:23:46.912218635 +0000 UTC m=+7970.084368492" Nov 29 03:23:52 crc kubenswrapper[4749]: I1129 03:23:52.346304 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:52 crc kubenswrapper[4749]: I1129 03:23:52.348495 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:52 crc kubenswrapper[4749]: I1129 03:23:52.424952 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:53 crc kubenswrapper[4749]: I1129 03:23:53.067477 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:53 crc kubenswrapper[4749]: I1129 03:23:53.138900 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:54 crc kubenswrapper[4749]: I1129 03:23:54.075902 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:23:54 crc kubenswrapper[4749]: E1129 03:23:54.076759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:23:54 crc kubenswrapper[4749]: I1129 03:23:54.998803 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qvk7s" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="registry-server" containerID="cri-o://797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e" gracePeriod=2 Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.493351 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.615610 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6jmv\" (UniqueName: \"kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv\") pod \"d9d501aa-874b-4b4b-8597-610530a577ca\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.615735 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content\") pod \"d9d501aa-874b-4b4b-8597-610530a577ca\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.615792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities\") pod \"d9d501aa-874b-4b4b-8597-610530a577ca\" (UID: \"d9d501aa-874b-4b4b-8597-610530a577ca\") " Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.617595 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities" (OuterVolumeSpecName: "utilities") pod "d9d501aa-874b-4b4b-8597-610530a577ca" (UID: "d9d501aa-874b-4b4b-8597-610530a577ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.622422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv" (OuterVolumeSpecName: "kube-api-access-k6jmv") pod "d9d501aa-874b-4b4b-8597-610530a577ca" (UID: "d9d501aa-874b-4b4b-8597-610530a577ca"). InnerVolumeSpecName "kube-api-access-k6jmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.687626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9d501aa-874b-4b4b-8597-610530a577ca" (UID: "d9d501aa-874b-4b4b-8597-610530a577ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.718744 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6jmv\" (UniqueName: \"kubernetes.io/projected/d9d501aa-874b-4b4b-8597-610530a577ca-kube-api-access-k6jmv\") on node \"crc\" DevicePath \"\"" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.718782 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:23:55 crc kubenswrapper[4749]: I1129 03:23:55.718792 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9d501aa-874b-4b4b-8597-610530a577ca-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.012420 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9d501aa-874b-4b4b-8597-610530a577ca" containerID="797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e" exitCode=0 Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.012484 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qvk7s" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.012487 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerDied","Data":"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e"} Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.012916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qvk7s" event={"ID":"d9d501aa-874b-4b4b-8597-610530a577ca","Type":"ContainerDied","Data":"ca565b7e2a29131c30ec343a982483ccae1f02921ae2c845bd5370ec12a17beb"} Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.012941 4749 scope.go:117] "RemoveContainer" containerID="797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.042574 4749 scope.go:117] "RemoveContainer" containerID="6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.053990 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.063145 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qvk7s"] Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.068228 4749 scope.go:117] "RemoveContainer" containerID="e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.136104 4749 scope.go:117] "RemoveContainer" containerID="797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e" Nov 29 03:23:56 crc kubenswrapper[4749]: E1129 03:23:56.136854 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e\": container with ID starting with 797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e not found: ID does not exist" containerID="797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.136894 
4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e"} err="failed to get container status \"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e\": rpc error: code = NotFound desc = could not find container \"797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e\": container with ID starting with 797daf7434ed6b4af607cbe7407b8928a1a88429bef94caef512a1c35d29df2e not found: ID does not exist" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.136922 4749 scope.go:117] "RemoveContainer" containerID="6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71" Nov 29 03:23:56 crc kubenswrapper[4749]: E1129 03:23:56.137393 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71\": container with ID starting with 6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71 not found: ID does not exist" containerID="6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.137414 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71"} err="failed to get container status \"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71\": rpc error: code = NotFound desc = could not find container \"6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71\": container with ID starting with 6ea4341ac35d61faa239f65c7adc046e2f46503c6683f9edabc02ccc27d5bb71 not found: ID does not exist" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.137429 4749 scope.go:117] "RemoveContainer" containerID="e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8" Nov 29 03:23:56 crc kubenswrapper[4749]: E1129 03:23:56.137736 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8\": container with ID starting with e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8 not found: ID does not exist" containerID="e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8" Nov 29 03:23:56 crc kubenswrapper[4749]: I1129 03:23:56.137753 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8"} err="failed to get container status \"e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8\": rpc error: code = NotFound desc = could not find container \"e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8\": container with ID starting with e80a223ea047ff2f71c35548231b61f3aa1a63eb6b1fda936c71151673b8c9e8 not found: ID does not exist" Nov 29 03:23:57 crc kubenswrapper[4749]: I1129 03:23:57.115961 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" path="/var/lib/kubelet/pods/d9d501aa-874b-4b4b-8597-610530a577ca/volumes" Nov 29 03:24:08 crc kubenswrapper[4749]: I1129 03:24:08.075653 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:24:08 crc kubenswrapper[4749]: E1129 03:24:08.076485 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:24:22 crc kubenswrapper[4749]: I1129 03:24:22.076044 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:24:22 crc kubenswrapper[4749]: E1129 03:24:22.077253 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:24:34 crc kubenswrapper[4749]: I1129 03:24:34.075928 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:24:34 crc kubenswrapper[4749]: E1129 03:24:34.077160 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:24:47 crc kubenswrapper[4749]: I1129 03:24:47.095496 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:24:47 crc kubenswrapper[4749]: E1129 03:24:47.097911 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:25:01 crc kubenswrapper[4749]: I1129 03:25:01.076406 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:25:01 crc kubenswrapper[4749]: E1129 03:25:01.077489 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:25:14 crc kubenswrapper[4749]: I1129 03:25:14.075020 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:25:14 crc kubenswrapper[4749]: E1129 03:25:14.075684 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:25:26 crc kubenswrapper[4749]: I1129 03:25:26.075637 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:25:27 crc kubenswrapper[4749]: I1129 03:25:27.212402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642"} Nov 29 03:26:37 crc kubenswrapper[4749]: I1129 03:26:37.138622 4749 generic.go:334] "Generic (PLEG): container finished" podID="402e3628-3422-4ebd-b3aa-aa8b36553f92" containerID="fcff08365669b0cfd90699a66fb1e870980c88255b9695c9c4f00ff964045a48" exitCode=0 Nov 29 03:26:37 crc kubenswrapper[4749]: I1129 03:26:37.138758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" event={"ID":"402e3628-3422-4ebd-b3aa-aa8b36553f92","Type":"ContainerDied","Data":"fcff08365669b0cfd90699a66fb1e870980c88255b9695c9c4f00ff964045a48"} Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.752425 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.848276 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0\") pod \"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.848824 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key\") pod \"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.848953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7jt5\" (UniqueName: \"kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5\") pod \"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.849574 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph\") pod \"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.849718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle\") pod \"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.849855 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory\") pod 
\"402e3628-3422-4ebd-b3aa-aa8b36553f92\" (UID: \"402e3628-3422-4ebd-b3aa-aa8b36553f92\") " Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.853539 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph" (OuterVolumeSpecName: "ceph") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.853973 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5" (OuterVolumeSpecName: "kube-api-access-j7jt5") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "kube-api-access-j7jt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.855489 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.878941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.880584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.883904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory" (OuterVolumeSpecName: "inventory") pod "402e3628-3422-4ebd-b3aa-aa8b36553f92" (UID: "402e3628-3422-4ebd-b3aa-aa8b36553f92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953429 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953462 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7jt5\" (UniqueName: \"kubernetes.io/projected/402e3628-3422-4ebd-b3aa-aa8b36553f92-kube-api-access-j7jt5\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953478 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953487 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953496 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:38 crc kubenswrapper[4749]: I1129 03:26:38.953507 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/402e3628-3422-4ebd-b3aa-aa8b36553f92-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.160849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" event={"ID":"402e3628-3422-4ebd-b3aa-aa8b36553f92","Type":"ContainerDied","Data":"6739e66121a20f0f574a517267ae8049d323cd0260849d3f82985564b02b43ae"} Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.160912 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6739e66121a20f0f574a517267ae8049d323cd0260849d3f82985564b02b43ae" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.160997 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-ms5ld" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.277070 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-wv7hb"] Nov 29 03:26:39 crc kubenswrapper[4749]: E1129 03:26:39.277695 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="extract-utilities" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.277723 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="extract-utilities" Nov 29 03:26:39 crc kubenswrapper[4749]: E1129 03:26:39.277746 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="extract-content" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.277756 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="extract-content" Nov 29 03:26:39 crc kubenswrapper[4749]: E1129 03:26:39.277782 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="registry-server" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.277791 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="registry-server" Nov 29 03:26:39 crc kubenswrapper[4749]: E1129 03:26:39.277814 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402e3628-3422-4ebd-b3aa-aa8b36553f92" containerName="libvirt-openstack-openstack-cell1" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.277823 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="402e3628-3422-4ebd-b3aa-aa8b36553f92" containerName="libvirt-openstack-openstack-cell1" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.278113 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d501aa-874b-4b4b-8597-610530a577ca" containerName="registry-server" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.278145 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="402e3628-3422-4ebd-b3aa-aa8b36553f92" containerName="libvirt-openstack-openstack-cell1" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.279162 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.283010 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.283384 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.283510 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.283658 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.284142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.284343 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.284467 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.288396 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-wv7hb"] Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360399 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9fs4\" (UniqueName: \"kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.360718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.462627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463192 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fs4\" (UniqueName: \"kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463298 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: 
\"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.463463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.464694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.464738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.467027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.467390 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.469174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.468951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.470143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.470213 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.471381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.473339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.485948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fs4\" (UniqueName: \"kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4\") pod \"nova-cell1-openstack-openstack-cell1-wv7hb\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:39 crc kubenswrapper[4749]: I1129 03:26:39.625433 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:26:40 crc kubenswrapper[4749]: I1129 03:26:40.272111 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-wv7hb"] Nov 29 03:26:41 crc kubenswrapper[4749]: I1129 03:26:41.184093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" event={"ID":"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec","Type":"ContainerStarted","Data":"11efa8de70fce3bbf9366ef73bd09eaf548e6d551848d4725e7c8c8475d28d9d"} Nov 29 03:26:42 crc kubenswrapper[4749]: I1129 03:26:42.196174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" event={"ID":"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec","Type":"ContainerStarted","Data":"f6c72aa1f3236c8c68998826d342d9a6c96d302fee861b2990b4d01fbe1eee6b"} Nov 29 03:26:42 crc kubenswrapper[4749]: I1129 03:26:42.224624 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" podStartSLOduration=2.595159046 podStartE2EDuration="3.224598866s" podCreationTimestamp="2025-11-29 03:26:39 +0000 UTC" firstStartedPulling="2025-11-29 03:26:40.26518141 +0000 UTC m=+8143.437331257" lastFinishedPulling="2025-11-29 03:26:40.8946212 +0000 UTC m=+8144.066771077" observedRunningTime="2025-11-29 03:26:42.21490412 +0000 UTC m=+8145.387053977" watchObservedRunningTime="2025-11-29 03:26:42.224598866 +0000 UTC m=+8145.396748753" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.348351 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.353145 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.364887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.530299 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jzm\" (UniqueName: \"kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.530347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.530442 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.632784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.633234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jzm\" (UniqueName: \"kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.633259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.633735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.633989 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.653993 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jzm\" (UniqueName: \"kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm\") pod \"certified-operators-kd9rv\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:53 crc kubenswrapper[4749]: I1129 03:27:53.727430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:27:54 crc kubenswrapper[4749]: I1129 03:27:54.227347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:27:54 crc kubenswrapper[4749]: W1129 03:27:54.228809 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ad0b0af_7ceb_4a35_a463_c6968821a17d.slice/crio-a27352af1f584a3b8a0de6ed71c27afe7a5ec6592742b2df2fad08f6ac7fc43a WatchSource:0}: Error finding container a27352af1f584a3b8a0de6ed71c27afe7a5ec6592742b2df2fad08f6ac7fc43a: Status 404 returned error can't find the container with id a27352af1f584a3b8a0de6ed71c27afe7a5ec6592742b2df2fad08f6ac7fc43a Nov 29 03:27:55 crc kubenswrapper[4749]: I1129 03:27:55.127861 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerID="220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88" exitCode=0 Nov 29 03:27:55 crc kubenswrapper[4749]: I1129 03:27:55.127981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerDied","Data":"220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88"} Nov 29 03:27:55 crc kubenswrapper[4749]: I1129 03:27:55.130475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerStarted","Data":"a27352af1f584a3b8a0de6ed71c27afe7a5ec6592742b2df2fad08f6ac7fc43a"} Nov 29 03:27:55 crc kubenswrapper[4749]: I1129 03:27:55.374467 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:27:55 crc kubenswrapper[4749]: I1129 03:27:55.374554 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:27:57 crc kubenswrapper[4749]: I1129 03:27:57.156653 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerID="96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc" exitCode=0 Nov 29 03:27:57 crc kubenswrapper[4749]: I1129 03:27:57.157268 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerDied","Data":"96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc"} Nov 29 03:27:58 crc kubenswrapper[4749]: I1129 03:27:58.174042 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerStarted","Data":"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b"} Nov 29 03:28:03 crc kubenswrapper[4749]: I1129 03:28:03.728399 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:03 crc kubenswrapper[4749]: I1129 03:28:03.728954 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:03 crc kubenswrapper[4749]: I1129 03:28:03.810403 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:03 crc kubenswrapper[4749]: I1129 03:28:03.837797 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kd9rv" podStartSLOduration=8.338002763 podStartE2EDuration="10.837772068s" podCreationTimestamp="2025-11-29 03:27:53 +0000 UTC" firstStartedPulling="2025-11-29 03:27:55.129578496 +0000 UTC m=+8218.301728353" lastFinishedPulling="2025-11-29 03:27:57.629347791 +0000 UTC m=+8220.801497658" observedRunningTime="2025-11-29 03:27:58.208749316 +0000 UTC m=+8221.380899243" watchObservedRunningTime="2025-11-29 03:28:03.837772068 +0000 UTC m=+8227.009921955" Nov 29 03:28:04 crc kubenswrapper[4749]: I1129 03:28:04.325298 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:04 crc kubenswrapper[4749]: I1129 03:28:04.383356 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.262727 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kd9rv" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="registry-server" containerID="cri-o://b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b" gracePeriod=2 Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.803693 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.965354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content\") pod \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.965567 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76jzm\" (UniqueName: \"kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm\") pod \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.965780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities\") pod \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\" (UID: \"2ad0b0af-7ceb-4a35-a463-c6968821a17d\") " Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.966649 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities" (OuterVolumeSpecName: "utilities") pod "2ad0b0af-7ceb-4a35-a463-c6968821a17d" (UID: "2ad0b0af-7ceb-4a35-a463-c6968821a17d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.967278 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:28:06 crc kubenswrapper[4749]: I1129 03:28:06.975656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm" (OuterVolumeSpecName: "kube-api-access-76jzm") pod "2ad0b0af-7ceb-4a35-a463-c6968821a17d" (UID: "2ad0b0af-7ceb-4a35-a463-c6968821a17d"). InnerVolumeSpecName "kube-api-access-76jzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.068894 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76jzm\" (UniqueName: \"kubernetes.io/projected/2ad0b0af-7ceb-4a35-a463-c6968821a17d-kube-api-access-76jzm\") on node \"crc\" DevicePath \"\"" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.289783 4749 generic.go:334] "Generic (PLEG): container finished" podID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerID="b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b" exitCode=0 Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.289840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerDied","Data":"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b"} Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.289881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kd9rv" event={"ID":"2ad0b0af-7ceb-4a35-a463-c6968821a17d","Type":"ContainerDied","Data":"a27352af1f584a3b8a0de6ed71c27afe7a5ec6592742b2df2fad08f6ac7fc43a"} Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.289915 4749 scope.go:117] "RemoveContainer" containerID="b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.290118 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kd9rv" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.318857 4749 scope.go:117] "RemoveContainer" containerID="96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.377396 4749 scope.go:117] "RemoveContainer" containerID="220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.382682 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ad0b0af-7ceb-4a35-a463-c6968821a17d" (UID: "2ad0b0af-7ceb-4a35-a463-c6968821a17d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.408678 4749 scope.go:117] "RemoveContainer" containerID="b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b" Nov 29 03:28:07 crc kubenswrapper[4749]: E1129 03:28:07.409547 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b\": container with ID starting with b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b not found: ID does not exist" containerID="b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.409638 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b"} err="failed to get container status \"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b\": rpc error: code = NotFound desc = could not find container \"b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b\": container with ID starting with b30a7f3834ef14176df9da258bca3254184ae3b0470b28549bd5767c1dbd361b not found: ID does not exist" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.409682 4749 scope.go:117] "RemoveContainer" containerID="96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc" Nov 29 03:28:07 crc kubenswrapper[4749]: E1129 03:28:07.410328 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc\": container with ID starting with 96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc not found: ID does not exist" containerID="96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.410389 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc"} err="failed to get container status \"96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc\": rpc error: code = NotFound desc = could not find container \"96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc\": container with ID starting with 96d426f6677f2fa6535d93e0913c5b7cc150164358635b42c3007d3541758dcc not found: ID does not exist" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.410433 4749 scope.go:117] "RemoveContainer" containerID="220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88" Nov 29 03:28:07 crc kubenswrapper[4749]: E1129 03:28:07.410992 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88\": container with ID starting with 220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88 not found: ID does not exist" containerID="220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.411022 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88"} err="failed to get container status \"220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88\": rpc error: code = NotFound desc = could not 
find container \"220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88\": container with ID starting with 220bba36b9bfe9d902a80e014f710e6d32c22e70c5d5bd8cc92858310d587d88 not found: ID does not exist" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.481857 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ad0b0af-7ceb-4a35-a463-c6968821a17d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.654088 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:28:07 crc kubenswrapper[4749]: I1129 03:28:07.654238 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kd9rv"] Nov 29 03:28:09 crc kubenswrapper[4749]: I1129 03:28:09.097921 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" path="/var/lib/kubelet/pods/2ad0b0af-7ceb-4a35-a463-c6968821a17d/volumes" Nov 29 03:28:25 crc kubenswrapper[4749]: I1129 03:28:25.374512 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:28:25 crc kubenswrapper[4749]: I1129 03:28:25.375056 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.374110 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.374826 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.374871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.375588 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.375655 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" 
containerID="cri-o://c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642" gracePeriod=600 Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.918046 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642" exitCode=0 Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.918172 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642"} Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.918597 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"} Nov 29 03:28:55 crc kubenswrapper[4749]: I1129 03:28:55.918618 4749 scope.go:117] "RemoveContainer" containerID="48bd4e009e95e769f6209721bf605386b678ad808114ff81d07fdc2b2efa58e0" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.164997 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp"] Nov 29 03:30:00 crc kubenswrapper[4749]: E1129 03:30:00.165965 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="extract-utilities" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.165984 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="extract-utilities" Nov 29 03:30:00 crc kubenswrapper[4749]: E1129 03:30:00.166013 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="extract-content" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.166022 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="extract-content" Nov 29 03:30:00 crc kubenswrapper[4749]: E1129 03:30:00.166050 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="registry-server" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.166058 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="registry-server" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.166341 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ad0b0af-7ceb-4a35-a463-c6968821a17d" containerName="registry-server" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.167292 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.170628 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.170998 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.175989 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp"] Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.332352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.332432 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.332508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cmp\" (UniqueName: \"kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.434906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cmp\" (UniqueName: \"kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.435727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.435804 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.437235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume\") pod 
\"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.442629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.452615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cmp\" (UniqueName: \"kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp\") pod \"collect-profiles-29406450-zkwwp\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:00 crc kubenswrapper[4749]: I1129 03:30:00.525091 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.009471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp"] Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.859482 4749 generic.go:334] "Generic (PLEG): container finished" podID="ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" containerID="f6c72aa1f3236c8c68998826d342d9a6c96d302fee861b2990b4d01fbe1eee6b" exitCode=0 Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.859669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" event={"ID":"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec","Type":"ContainerDied","Data":"f6c72aa1f3236c8c68998826d342d9a6c96d302fee861b2990b4d01fbe1eee6b"} Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.862540 4749 generic.go:334] "Generic (PLEG): container finished" podID="be5e6fa0-b00e-427e-a707-ff4498862473" containerID="4a6cc072372f9e3435a11d719a5400df7fa07ebf0c9cbc487944a1ad55f75e46" exitCode=0 Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.862594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" event={"ID":"be5e6fa0-b00e-427e-a707-ff4498862473","Type":"ContainerDied","Data":"4a6cc072372f9e3435a11d719a5400df7fa07ebf0c9cbc487944a1ad55f75e46"} Nov 29 03:30:01 crc kubenswrapper[4749]: I1129 03:30:01.862627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" event={"ID":"be5e6fa0-b00e-427e-a707-ff4498862473","Type":"ContainerStarted","Data":"cddb1ed176acb48aa36c2bf572f55c4931dd456ad8cb92a35bc160d685cc7787"} Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.303890 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.407798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume\") pod \"be5e6fa0-b00e-427e-a707-ff4498862473\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.408314 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5cmp\" (UniqueName: \"kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp\") pod \"be5e6fa0-b00e-427e-a707-ff4498862473\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.408441 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume\") pod \"be5e6fa0-b00e-427e-a707-ff4498862473\" (UID: \"be5e6fa0-b00e-427e-a707-ff4498862473\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.409425 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume" (OuterVolumeSpecName: "config-volume") pod "be5e6fa0-b00e-427e-a707-ff4498862473" (UID: "be5e6fa0-b00e-427e-a707-ff4498862473"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.413412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be5e6fa0-b00e-427e-a707-ff4498862473" (UID: "be5e6fa0-b00e-427e-a707-ff4498862473"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.414030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp" (OuterVolumeSpecName: "kube-api-access-s5cmp") pod "be5e6fa0-b00e-427e-a707-ff4498862473" (UID: "be5e6fa0-b00e-427e-a707-ff4498862473"). InnerVolumeSpecName "kube-api-access-s5cmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.479720 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.511764 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5cmp\" (UniqueName: \"kubernetes.io/projected/be5e6fa0-b00e-427e-a707-ff4498862473-kube-api-access-s5cmp\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.511828 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be5e6fa0-b00e-427e-a707-ff4498862473-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.511845 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be5e6fa0-b00e-427e-a707-ff4498862473-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.613242 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9fs4\" (UniqueName: \"kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.613327 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.613404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614696 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614727 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.614866 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle\") pod \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\" (UID: \"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec\") " Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.617643 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph" (OuterVolumeSpecName: "ceph") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.618714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.630249 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4" (OuterVolumeSpecName: "kube-api-access-d9fs4") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "kube-api-access-d9fs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.646145 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.662277 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.662797 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory" (OuterVolumeSpecName: "inventory") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.662925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.666856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.673387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.684549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.691389 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" (UID: "ec28e2f8-bbc8-4808-8f15-e314e66ef4ec"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.717958 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9fs4\" (UniqueName: \"kubernetes.io/projected/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-kube-api-access-d9fs4\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718007 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718021 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718033 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718048 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718060 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718071 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718082 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718092 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718103 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.718115 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec28e2f8-bbc8-4808-8f15-e314e66ef4ec-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.886484 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.886476 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406450-zkwwp" event={"ID":"be5e6fa0-b00e-427e-a707-ff4498862473","Type":"ContainerDied","Data":"cddb1ed176acb48aa36c2bf572f55c4931dd456ad8cb92a35bc160d685cc7787"} Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.886926 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cddb1ed176acb48aa36c2bf572f55c4931dd456ad8cb92a35bc160d685cc7787" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.889042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" event={"ID":"ec28e2f8-bbc8-4808-8f15-e314e66ef4ec","Type":"ContainerDied","Data":"11efa8de70fce3bbf9366ef73bd09eaf548e6d551848d4725e7c8c8475d28d9d"} Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.889065 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11efa8de70fce3bbf9366ef73bd09eaf548e6d551848d4725e7c8c8475d28d9d" Nov 29 03:30:03 crc kubenswrapper[4749]: I1129 03:30:03.889152 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-wv7hb" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.004553 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-npfts"] Nov 29 03:30:04 crc kubenswrapper[4749]: E1129 03:30:04.005055 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5e6fa0-b00e-427e-a707-ff4498862473" containerName="collect-profiles" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.005075 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5e6fa0-b00e-427e-a707-ff4498862473" containerName="collect-profiles" Nov 29 03:30:04 crc kubenswrapper[4749]: E1129 03:30:04.005109 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" containerName="nova-cell1-openstack-openstack-cell1" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.005119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" containerName="nova-cell1-openstack-openstack-cell1" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.005439 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5e6fa0-b00e-427e-a707-ff4498862473" containerName="collect-profiles" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.005472 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec28e2f8-bbc8-4808-8f15-e314e66ef4ec" containerName="nova-cell1-openstack-openstack-cell1" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.006408 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.008921 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.008917 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.011430 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.011843 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.013879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.027810 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-npfts"] Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.127725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkl8l\" (UniqueName: \"kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128031 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128280 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.128845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230577 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230642 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230740 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph\") pod 
\"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.230993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkl8l\" (UniqueName: \"kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.235577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.235847 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.235848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.236129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.236147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.236716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " 
pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.240498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.269316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkl8l\" (UniqueName: \"kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l\") pod \"telemetry-openstack-openstack-cell1-npfts\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.371903 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj"] Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.380533 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406405-gz8rj"] Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.384729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.939385 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-npfts"] Nov 29 03:30:04 crc kubenswrapper[4749]: I1129 03:30:04.941989 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:30:05 crc kubenswrapper[4749]: I1129 03:30:05.088484 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f0604d-7a7b-4d66-8a77-f389c7a7d406" path="/var/lib/kubelet/pods/f7f0604d-7a7b-4d66-8a77-f389c7a7d406/volumes" Nov 29 03:30:05 crc kubenswrapper[4749]: I1129 03:30:05.912883 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-npfts" event={"ID":"0b460379-aed8-40bb-b56e-f20fc64761bf","Type":"ContainerStarted","Data":"5496046d2cd7fe749b2e06bfd61042b966ec3bae4948aa39f180c5def3c9a69e"} Nov 29 03:30:06 crc kubenswrapper[4749]: I1129 03:30:06.928728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-npfts" event={"ID":"0b460379-aed8-40bb-b56e-f20fc64761bf","Type":"ContainerStarted","Data":"4b20e364099614a551dfbdb8da7bcd447c1e48785830a3498d3a6855479db51d"} Nov 29 03:30:06 crc kubenswrapper[4749]: I1129 03:30:06.964151 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-npfts" podStartSLOduration=3.269101311 podStartE2EDuration="3.964123659s" podCreationTimestamp="2025-11-29 03:30:03 +0000 UTC" firstStartedPulling="2025-11-29 03:30:04.94177913 +0000 UTC m=+8348.113928987" lastFinishedPulling="2025-11-29 03:30:05.636801438 +0000 UTC m=+8348.808951335" observedRunningTime="2025-11-29 03:30:06.954912035 +0000 UTC m=+8350.127061932" watchObservedRunningTime="2025-11-29 03:30:06.964123659 +0000 UTC m=+8350.136273556" Nov 29 03:30:26 crc kubenswrapper[4749]: I1129 03:30:26.796248 4749 scope.go:117] "RemoveContainer" containerID="5744420dc8629deaf306a5678b4b436247af9eae850f8d1d3753e13a31600aad" Nov 29 03:30:34 crc 
kubenswrapper[4749]: I1129 03:30:34.690911 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.699040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.729247 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.885750 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.885807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.886073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2jf\" (UniqueName: \"kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.988392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.988443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.988552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2jf\" (UniqueName: \"kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.988977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:34 crc kubenswrapper[4749]: I1129 03:30:34.989107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:35 crc kubenswrapper[4749]: I1129 03:30:35.008155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2jf\" (UniqueName: \"kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf\") pod \"redhat-marketplace-f5xct\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:35 crc kubenswrapper[4749]: I1129 03:30:35.021872 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:35 crc kubenswrapper[4749]: W1129 03:30:35.570471 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ef1015_ccf1_49b1_8648_829338b4f4b2.slice/crio-1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8 WatchSource:0}: Error finding container 1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8: Status 404 returned error can't find the container with id 1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8 Nov 29 03:30:35 crc kubenswrapper[4749]: I1129 03:30:35.579456 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:36 crc kubenswrapper[4749]: E1129 03:30:36.030376 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ef1015_ccf1_49b1_8648_829338b4f4b2.slice/crio-conmon-6eac411da97e1ef72e6d427cff8059c3c6b961abb9d0d74714ab0bcfd7865f6d.scope\": RecentStats: unable to find data in memory cache]" Nov 29 03:30:36 crc kubenswrapper[4749]: I1129 03:30:36.251891 4749 generic.go:334] "Generic (PLEG): container finished" podID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerID="6eac411da97e1ef72e6d427cff8059c3c6b961abb9d0d74714ab0bcfd7865f6d" exitCode=0 Nov 29 03:30:36 crc kubenswrapper[4749]: I1129 03:30:36.251950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerDied","Data":"6eac411da97e1ef72e6d427cff8059c3c6b961abb9d0d74714ab0bcfd7865f6d"} Nov 29 03:30:36 crc kubenswrapper[4749]: I1129 03:30:36.252249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerStarted","Data":"1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8"} Nov 29 03:30:38 crc kubenswrapper[4749]: I1129 03:30:38.298656 4749 generic.go:334] "Generic (PLEG): container finished" podID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerID="5644c5084e0f00932e2a4812915720d5fa30298771faed009827248eb4d28168" exitCode=0 Nov 29 03:30:38 crc kubenswrapper[4749]: I1129 03:30:38.300356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerDied","Data":"5644c5084e0f00932e2a4812915720d5fa30298771faed009827248eb4d28168"} Nov 29 03:30:39 crc kubenswrapper[4749]: I1129 03:30:39.312872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerStarted","Data":"289380718c40d22cd49cfde1a1fa1828a6882829b70207b9433f0771d46a2d92"} Nov 29 03:30:39 crc kubenswrapper[4749]: I1129 03:30:39.342074 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f5xct" podStartSLOduration=2.867909603 podStartE2EDuration="5.342054884s" podCreationTimestamp="2025-11-29 03:30:34 +0000 UTC" firstStartedPulling="2025-11-29 03:30:36.254140632 +0000 UTC m=+8379.426290499" lastFinishedPulling="2025-11-29 03:30:38.728285923 +0000 UTC m=+8381.900435780" observedRunningTime="2025-11-29 03:30:39.333820434 +0000 UTC m=+8382.505970291" watchObservedRunningTime="2025-11-29 03:30:39.342054884 +0000 UTC m=+8382.514204741" Nov 29 03:30:45 crc kubenswrapper[4749]: I1129 03:30:45.022681 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:45 crc kubenswrapper[4749]: I1129 03:30:45.023398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:45 crc kubenswrapper[4749]: I1129 03:30:45.095263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:45 crc kubenswrapper[4749]: I1129 03:30:45.644769 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:45 crc kubenswrapper[4749]: I1129 03:30:45.724444 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:47 crc kubenswrapper[4749]: I1129 03:30:47.596522 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f5xct" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="registry-server" containerID="cri-o://289380718c40d22cd49cfde1a1fa1828a6882829b70207b9433f0771d46a2d92" gracePeriod=2 Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.609563 4749 generic.go:334] "Generic (PLEG): container finished" podID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerID="289380718c40d22cd49cfde1a1fa1828a6882829b70207b9433f0771d46a2d92" exitCode=0 Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.609715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerDied","Data":"289380718c40d22cd49cfde1a1fa1828a6882829b70207b9433f0771d46a2d92"} Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.609901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5xct" event={"ID":"39ef1015-ccf1-49b1-8648-829338b4f4b2","Type":"ContainerDied","Data":"1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8"} Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.609916 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee452950260461bf6b5ba0d0c67370142e74f472ff76b500093c491430088e8" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.647280 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.806593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content\") pod \"39ef1015-ccf1-49b1-8648-829338b4f4b2\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.806721 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities\") pod \"39ef1015-ccf1-49b1-8648-829338b4f4b2\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.806744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m2jf\" (UniqueName: \"kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf\") pod \"39ef1015-ccf1-49b1-8648-829338b4f4b2\" (UID: \"39ef1015-ccf1-49b1-8648-829338b4f4b2\") " Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.807681 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities" (OuterVolumeSpecName: "utilities") pod "39ef1015-ccf1-49b1-8648-829338b4f4b2" (UID: "39ef1015-ccf1-49b1-8648-829338b4f4b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.812959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf" (OuterVolumeSpecName: "kube-api-access-6m2jf") pod "39ef1015-ccf1-49b1-8648-829338b4f4b2" (UID: "39ef1015-ccf1-49b1-8648-829338b4f4b2"). InnerVolumeSpecName "kube-api-access-6m2jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.829735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39ef1015-ccf1-49b1-8648-829338b4f4b2" (UID: "39ef1015-ccf1-49b1-8648-829338b4f4b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.909272 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.909306 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39ef1015-ccf1-49b1-8648-829338b4f4b2-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:48 crc kubenswrapper[4749]: I1129 03:30:48.909318 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m2jf\" (UniqueName: \"kubernetes.io/projected/39ef1015-ccf1-49b1-8648-829338b4f4b2-kube-api-access-6m2jf\") on node \"crc\" DevicePath \"\"" Nov 29 03:30:49 crc kubenswrapper[4749]: I1129 03:30:49.621394 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5xct" Nov 29 03:30:49 crc kubenswrapper[4749]: I1129 03:30:49.657710 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:49 crc kubenswrapper[4749]: I1129 03:30:49.673455 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5xct"] Nov 29 03:30:51 crc kubenswrapper[4749]: I1129 03:30:51.094072 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" path="/var/lib/kubelet/pods/39ef1015-ccf1-49b1-8648-829338b4f4b2/volumes" Nov 29 03:30:55 crc kubenswrapper[4749]: I1129 03:30:55.373914 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:30:55 crc kubenswrapper[4749]: I1129 03:30:55.374367 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:31:25 crc kubenswrapper[4749]: I1129 03:31:25.374177 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:31:25 crc kubenswrapper[4749]: I1129 03:31:25.374723 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.585941 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:31:37 crc kubenswrapper[4749]: E1129 03:31:37.587475 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="registry-server" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.587509 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="registry-server" Nov 29 03:31:37 crc kubenswrapper[4749]: E1129 03:31:37.587541 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="extract-utilities" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.587560 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="extract-utilities" Nov 29 03:31:37 crc kubenswrapper[4749]: E1129 03:31:37.587600 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="extract-content" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.587620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="extract-content" Nov 29 03:31:37 crc 
kubenswrapper[4749]: I1129 03:31:37.588290 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ef1015-ccf1-49b1-8648-829338b4f4b2" containerName="registry-server" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.593944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.608005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.638275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmkxr\" (UniqueName: \"kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.638549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.639109 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.741387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.741603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmkxr\" (UniqueName: \"kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.741645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.741842 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.742101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.770876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmkxr\" (UniqueName: \"kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr\") pod \"redhat-operators-nwdvw\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:37 crc kubenswrapper[4749]: I1129 03:31:37.935791 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:38 crc kubenswrapper[4749]: I1129 03:31:38.450575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:31:39 crc kubenswrapper[4749]: I1129 03:31:39.241742 4749 generic.go:334] "Generic (PLEG): container finished" podID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerID="762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545" exitCode=0 Nov 29 03:31:39 crc kubenswrapper[4749]: I1129 03:31:39.241878 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerDied","Data":"762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545"} Nov 29 03:31:39 crc kubenswrapper[4749]: I1129 03:31:39.242118 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerStarted","Data":"41bc62a831e1398fcdc28834e9130081627db43a8c65d73a3214f3887de3079d"} Nov 29 03:31:40 crc kubenswrapper[4749]: I1129 03:31:40.258607 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerStarted","Data":"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d"} Nov 29 03:31:44 crc kubenswrapper[4749]: I1129 03:31:44.310948 4749 generic.go:334] "Generic (PLEG): container finished" podID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerID="e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d" exitCode=0 Nov 29 03:31:44 crc kubenswrapper[4749]: I1129 03:31:44.311469 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerDied","Data":"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d"} Nov 29 03:31:45 crc kubenswrapper[4749]: I1129 03:31:45.328386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerStarted","Data":"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d"} Nov 29 03:31:45 crc kubenswrapper[4749]: I1129 03:31:45.368809 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nwdvw" podStartSLOduration=2.850283459 podStartE2EDuration="8.368783812s" podCreationTimestamp="2025-11-29 03:31:37 +0000 UTC" firstStartedPulling="2025-11-29 03:31:39.244157933 +0000 UTC m=+8442.416307800" lastFinishedPulling="2025-11-29 03:31:44.762658296 +0000 UTC m=+8447.934808153" 
observedRunningTime="2025-11-29 03:31:45.356821241 +0000 UTC m=+8448.528971108" watchObservedRunningTime="2025-11-29 03:31:45.368783812 +0000 UTC m=+8448.540933679" Nov 29 03:31:47 crc kubenswrapper[4749]: I1129 03:31:47.936282 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:47 crc kubenswrapper[4749]: I1129 03:31:47.938013 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:48 crc kubenswrapper[4749]: I1129 03:31:48.998303 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nwdvw" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="registry-server" probeResult="failure" output=< Nov 29 03:31:48 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 03:31:48 crc kubenswrapper[4749]: > Nov 29 03:31:55 crc kubenswrapper[4749]: I1129 03:31:55.374423 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:31:55 crc kubenswrapper[4749]: I1129 03:31:55.375031 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:31:55 crc kubenswrapper[4749]: I1129 03:31:55.375102 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:31:55 crc kubenswrapper[4749]: I1129 03:31:55.376284 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:31:55 crc kubenswrapper[4749]: I1129 03:31:55.376415 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" gracePeriod=600 Nov 29 03:31:55 crc kubenswrapper[4749]: E1129 03:31:55.511071 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:31:56 crc kubenswrapper[4749]: I1129 03:31:56.449984 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" exitCode=0 Nov 29 03:31:56 crc kubenswrapper[4749]: I1129 03:31:56.450031 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"} Nov 29 03:31:56 crc kubenswrapper[4749]: I1129 03:31:56.451364 4749 scope.go:117] "RemoveContainer" containerID="c88e8c91ecb9b62e36cb8ec0f16c22f036b3b08638e6ba452be1d393633a6642" Nov 29 03:31:56 crc kubenswrapper[4749]: I1129 03:31:56.452126 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:31:56 crc kubenswrapper[4749]: E1129 03:31:56.452603 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:31:57 crc kubenswrapper[4749]: I1129 03:31:57.984453 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:58 crc kubenswrapper[4749]: I1129 03:31:58.048848 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:31:58 crc kubenswrapper[4749]: I1129 03:31:58.225969 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:31:59 crc kubenswrapper[4749]: I1129 03:31:59.489252 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nwdvw" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="registry-server" containerID="cri-o://195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d" gracePeriod=2 Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.061479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.097666 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities\") pod \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.097777 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmkxr\" (UniqueName: \"kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr\") pod \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.098104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content\") pod \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\" (UID: \"9aff0bcb-39d0-4ceb-8c06-41e563169ae1\") " Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.099722 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities" (OuterVolumeSpecName: "utilities") pod "9aff0bcb-39d0-4ceb-8c06-41e563169ae1" (UID: "9aff0bcb-39d0-4ceb-8c06-41e563169ae1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.112919 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr" (OuterVolumeSpecName: "kube-api-access-kmkxr") pod "9aff0bcb-39d0-4ceb-8c06-41e563169ae1" (UID: "9aff0bcb-39d0-4ceb-8c06-41e563169ae1"). InnerVolumeSpecName "kube-api-access-kmkxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.201368 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.201410 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmkxr\" (UniqueName: \"kubernetes.io/projected/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-kube-api-access-kmkxr\") on node \"crc\" DevicePath \"\"" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.219443 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aff0bcb-39d0-4ceb-8c06-41e563169ae1" (UID: "9aff0bcb-39d0-4ceb-8c06-41e563169ae1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.303605 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aff0bcb-39d0-4ceb-8c06-41e563169ae1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.501886 4749 generic.go:334] "Generic (PLEG): container finished" podID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerID="195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d" exitCode=0 Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.501949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerDied","Data":"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d"} Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.501975 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nwdvw" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.501999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nwdvw" event={"ID":"9aff0bcb-39d0-4ceb-8c06-41e563169ae1","Type":"ContainerDied","Data":"41bc62a831e1398fcdc28834e9130081627db43a8c65d73a3214f3887de3079d"} Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.502032 4749 scope.go:117] "RemoveContainer" containerID="195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.526430 4749 scope.go:117] "RemoveContainer" containerID="e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.552079 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.567324 4749 scope.go:117] "RemoveContainer" containerID="762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.571433 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nwdvw"] Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.620248 4749 scope.go:117] "RemoveContainer" containerID="195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d" Nov 29 03:32:00 crc kubenswrapper[4749]: E1129 03:32:00.620846 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d\": container with ID starting with 195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d not found: ID does not exist" containerID="195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.620935 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d"} err="failed to get container status \"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d\": rpc error: code = NotFound desc = could not find container \"195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d\": container with ID starting with 195d18eb691383152dab1da139b993ea74e5c39d599ef91b86cef8ef80288d0d not found: ID does not exist" Nov 29 03:32:00 crc 
kubenswrapper[4749]: I1129 03:32:00.620990 4749 scope.go:117] "RemoveContainer" containerID="e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d" Nov 29 03:32:00 crc kubenswrapper[4749]: E1129 03:32:00.624688 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d\": container with ID starting with e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d not found: ID does not exist" containerID="e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.624737 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d"} err="failed to get container status \"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d\": rpc error: code = NotFound desc = could not find container \"e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d\": container with ID starting with e382756b6d3ee735963028de82c71f4eaf0f4dd2b5b4a5e182ecaf1d335bad4d not found: ID does not exist" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.624768 4749 scope.go:117] "RemoveContainer" containerID="762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545" Nov 29 03:32:00 crc kubenswrapper[4749]: E1129 03:32:00.625250 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545\": container with ID starting with 762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545 not found: ID does not exist" containerID="762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545" Nov 29 03:32:00 crc kubenswrapper[4749]: I1129 03:32:00.625310 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545"} err="failed to get container status \"762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545\": rpc error: code = NotFound desc = could not find container \"762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545\": container with ID starting with 762d21f18f4df28c6153ef4a5a15af14c894768baf91122458e605178b1d3545 not found: ID does not exist" Nov 29 03:32:01 crc kubenswrapper[4749]: I1129 03:32:01.100312 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" path="/var/lib/kubelet/pods/9aff0bcb-39d0-4ceb-8c06-41e563169ae1/volumes" Nov 29 03:32:11 crc kubenswrapper[4749]: I1129 03:32:11.077600 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:32:11 crc kubenswrapper[4749]: E1129 03:32:11.078834 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:32:26 crc kubenswrapper[4749]: I1129 03:32:26.076168 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" 
Nov 29 03:32:26 crc kubenswrapper[4749]: E1129 03:32:26.077372 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:32:37 crc kubenswrapper[4749]: I1129 03:32:37.088005 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:32:37 crc kubenswrapper[4749]: E1129 03:32:37.088937 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:32:48 crc kubenswrapper[4749]: I1129 03:32:48.074791 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:32:48 crc kubenswrapper[4749]: E1129 03:32:48.075780 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:33:03 crc kubenswrapper[4749]: I1129 03:33:03.075070 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:33:03 crc kubenswrapper[4749]: E1129 03:33:03.075960 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:33:18 crc kubenswrapper[4749]: I1129 03:33:18.076342 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:33:18 crc kubenswrapper[4749]: E1129 03:33:18.077525 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:33:31 crc kubenswrapper[4749]: I1129 03:33:31.075766 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:33:31 crc kubenswrapper[4749]: E1129 03:33:31.080823 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:33:43 crc kubenswrapper[4749]: I1129 03:33:43.074982 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:33:43 crc kubenswrapper[4749]: E1129 03:33:43.075939 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:33:58 crc kubenswrapper[4749]: I1129 03:33:58.075259 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:33:58 crc kubenswrapper[4749]: E1129 03:33:58.076068 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:34:10 crc kubenswrapper[4749]: I1129 03:34:10.097349 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b460379-aed8-40bb-b56e-f20fc64761bf" containerID="4b20e364099614a551dfbdb8da7bcd447c1e48785830a3498d3a6855479db51d" exitCode=0 Nov 29 03:34:10 crc kubenswrapper[4749]: I1129 03:34:10.097512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-npfts" event={"ID":"0b460379-aed8-40bb-b56e-f20fc64761bf","Type":"ContainerDied","Data":"4b20e364099614a551dfbdb8da7bcd447c1e48785830a3498d3a6855479db51d"} Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.719269 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.785873 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.785943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786007 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkl8l\" (UniqueName: \"kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786286 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.786308 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key\") pod \"0b460379-aed8-40bb-b56e-f20fc64761bf\" (UID: \"0b460379-aed8-40bb-b56e-f20fc64761bf\") " Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.793371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l" (OuterVolumeSpecName: "kube-api-access-zkl8l") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "kube-api-access-zkl8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.793507 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph" (OuterVolumeSpecName: "ceph") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.793802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.825452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.825732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.836110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.847162 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory" (OuterVolumeSpecName: "inventory") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.848930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b460379-aed8-40bb-b56e-f20fc64761bf" (UID: "0b460379-aed8-40bb-b56e-f20fc64761bf"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890277 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890316 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890332 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890346 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890358 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890370 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890382 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b460379-aed8-40bb-b56e-f20fc64761bf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:11 crc kubenswrapper[4749]: I1129 03:34:11.890394 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkl8l\" (UniqueName: \"kubernetes.io/projected/0b460379-aed8-40bb-b56e-f20fc64761bf-kube-api-access-zkl8l\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.122200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-npfts" event={"ID":"0b460379-aed8-40bb-b56e-f20fc64761bf","Type":"ContainerDied","Data":"5496046d2cd7fe749b2e06bfd61042b966ec3bae4948aa39f180c5def3c9a69e"} Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.122264 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5496046d2cd7fe749b2e06bfd61042b966ec3bae4948aa39f180c5def3c9a69e" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.122315 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-npfts" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.261599 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gnwc6"] Nov 29 03:34:12 crc kubenswrapper[4749]: E1129 03:34:12.262066 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b460379-aed8-40bb-b56e-f20fc64761bf" containerName="telemetry-openstack-openstack-cell1" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262088 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b460379-aed8-40bb-b56e-f20fc64761bf" containerName="telemetry-openstack-openstack-cell1" Nov 29 03:34:12 crc kubenswrapper[4749]: E1129 03:34:12.262107 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="registry-server" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262113 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="registry-server" Nov 29 03:34:12 crc kubenswrapper[4749]: E1129 03:34:12.262130 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="extract-content" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262137 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="extract-content" Nov 29 03:34:12 crc kubenswrapper[4749]: E1129 03:34:12.262155 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="extract-utilities" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262162 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="extract-utilities" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262404 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b460379-aed8-40bb-b56e-f20fc64761bf" containerName="telemetry-openstack-openstack-cell1" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.262418 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aff0bcb-39d0-4ceb-8c06-41e563169ae1" containerName="registry-server" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.263121 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.268774 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.269057 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.269428 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.269602 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.281097 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gnwc6"] Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.281745 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298418 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbcvf\" (UniqueName: \"kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298814 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298868 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.298977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: E1129 03:34:12.381668 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b460379_aed8_40bb_b56e_f20fc64761bf.slice/crio-5496046d2cd7fe749b2e06bfd61042b966ec3bae4948aa39f180c5def3c9a69e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b460379_aed8_40bb_b56e_f20fc64761bf.slice\": RecentStats: unable to find data in memory cache]" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.406656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.406767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.406819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcvf\" (UniqueName: \"kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.406927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.406960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.407084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.412815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.417186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.419497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.421678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.427169 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.427966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcvf\" (UniqueName: \"kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf\") pod \"neutron-sriov-openstack-openstack-cell1-gnwc6\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:12 crc kubenswrapper[4749]: I1129 03:34:12.594670 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:34:13 crc kubenswrapper[4749]: I1129 03:34:13.076944 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:34:13 crc kubenswrapper[4749]: E1129 03:34:13.078003 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:34:13 crc kubenswrapper[4749]: I1129 03:34:13.219506 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-gnwc6"] Nov 29 03:34:13 crc kubenswrapper[4749]: W1129 03:34:13.225885 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e10fa10_d91b_497f_801e_2b6093ebdb8d.slice/crio-c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618 WatchSource:0}: Error finding container c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618: Status 404 returned error can't find the container with id c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618 Nov 29 03:34:14 crc kubenswrapper[4749]: I1129 03:34:14.154910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" event={"ID":"8e10fa10-d91b-497f-801e-2b6093ebdb8d","Type":"ContainerStarted","Data":"8efc02866887fe32a98d3619b22cdf1398ddbe8faba7d52b9fe64e23ade5c306"} Nov 29 03:34:14 crc kubenswrapper[4749]: I1129 03:34:14.155439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" event={"ID":"8e10fa10-d91b-497f-801e-2b6093ebdb8d","Type":"ContainerStarted","Data":"c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618"} Nov 29 03:34:14 crc kubenswrapper[4749]: I1129 03:34:14.193857 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" podStartSLOduration=1.605778199 podStartE2EDuration="2.193836576s" podCreationTimestamp="2025-11-29 03:34:12 +0000 UTC" firstStartedPulling="2025-11-29 03:34:13.229463455 +0000 UTC m=+8596.401613312" lastFinishedPulling="2025-11-29 03:34:13.817521832 +0000 UTC m=+8596.989671689" observedRunningTime="2025-11-29 03:34:14.180997133 +0000 UTC m=+8597.353147090" watchObservedRunningTime="2025-11-29 03:34:14.193836576 +0000 UTC m=+8597.365986443" Nov 29 03:34:28 crc kubenswrapper[4749]: I1129 03:34:28.075095 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:34:28 crc kubenswrapper[4749]: E1129 03:34:28.075889 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:34:40 crc kubenswrapper[4749]: I1129 03:34:40.076337 4749 scope.go:117] 
"RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:34:40 crc kubenswrapper[4749]: E1129 03:34:40.077847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.781514 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.788577 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.817944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.955582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.955960 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:45 crc kubenswrapper[4749]: I1129 03:34:45.956064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7tg\" (UniqueName: \"kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.057692 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.058109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.058176 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 
03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.058328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7tg\" (UniqueName: \"kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.058584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.077484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7tg\" (UniqueName: \"kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg\") pod \"community-operators-hcl82\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.129724 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.678837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:34:46 crc kubenswrapper[4749]: I1129 03:34:46.755360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerStarted","Data":"ec91fb9dfb3b6e12a0530d7093ca58ae1cb99893452f2c98d6b8e84e466a95cc"} Nov 29 03:34:47 crc kubenswrapper[4749]: I1129 03:34:47.771809 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerID="2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75" exitCode=0 Nov 29 03:34:47 crc kubenswrapper[4749]: I1129 03:34:47.771875 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerDied","Data":"2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75"} Nov 29 03:34:49 crc kubenswrapper[4749]: I1129 03:34:49.797007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerStarted","Data":"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57"} Nov 29 03:34:50 crc kubenswrapper[4749]: I1129 03:34:50.816430 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerID="6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57" exitCode=0 Nov 29 03:34:50 crc kubenswrapper[4749]: I1129 03:34:50.816816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerDied","Data":"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57"} Nov 29 03:34:50 crc kubenswrapper[4749]: I1129 03:34:50.816859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" 
event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerStarted","Data":"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521"} Nov 29 03:34:50 crc kubenswrapper[4749]: I1129 03:34:50.856891 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hcl82" podStartSLOduration=3.321098299 podStartE2EDuration="5.856861128s" podCreationTimestamp="2025-11-29 03:34:45 +0000 UTC" firstStartedPulling="2025-11-29 03:34:47.775455224 +0000 UTC m=+8630.947605121" lastFinishedPulling="2025-11-29 03:34:50.311218083 +0000 UTC m=+8633.483367950" observedRunningTime="2025-11-29 03:34:50.845090001 +0000 UTC m=+8634.017239918" watchObservedRunningTime="2025-11-29 03:34:50.856861128 +0000 UTC m=+8634.029011025" Nov 29 03:34:55 crc kubenswrapper[4749]: I1129 03:34:55.076094 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6" Nov 29 03:34:55 crc kubenswrapper[4749]: E1129 03:34:55.077215 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:34:56 crc kubenswrapper[4749]: I1129 03:34:56.130224 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:56 crc kubenswrapper[4749]: I1129 03:34:56.131344 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:56 crc kubenswrapper[4749]: I1129 03:34:56.211050 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:56 crc kubenswrapper[4749]: I1129 03:34:56.980324 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:57 crc kubenswrapper[4749]: I1129 03:34:57.072688 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:34:58 crc kubenswrapper[4749]: I1129 03:34:58.906768 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hcl82" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="registry-server" containerID="cri-o://4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521" gracePeriod=2 Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.441501 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.584863 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt7tg\" (UniqueName: \"kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg\") pod \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.584935 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content\") pod \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.585123 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities\") pod \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\" (UID: \"d9cb41e6-5c41-472d-b5a7-e863850b8a36\") " Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.585915 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities" (OuterVolumeSpecName: "utilities") pod "d9cb41e6-5c41-472d-b5a7-e863850b8a36" (UID: "d9cb41e6-5c41-472d-b5a7-e863850b8a36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.589674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg" (OuterVolumeSpecName: "kube-api-access-mt7tg") pod "d9cb41e6-5c41-472d-b5a7-e863850b8a36" (UID: "d9cb41e6-5c41-472d-b5a7-e863850b8a36"). InnerVolumeSpecName "kube-api-access-mt7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.688361 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt7tg\" (UniqueName: \"kubernetes.io/projected/d9cb41e6-5c41-472d-b5a7-e863850b8a36-kube-api-access-mt7tg\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.688760 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.689928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9cb41e6-5c41-472d-b5a7-e863850b8a36" (UID: "d9cb41e6-5c41-472d-b5a7-e863850b8a36"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.791109 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9cb41e6-5c41-472d-b5a7-e863850b8a36-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.922994 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerID="4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521" exitCode=0 Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.923054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerDied","Data":"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521"} Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.923088 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcl82" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.923101 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcl82" event={"ID":"d9cb41e6-5c41-472d-b5a7-e863850b8a36","Type":"ContainerDied","Data":"ec91fb9dfb3b6e12a0530d7093ca58ae1cb99893452f2c98d6b8e84e466a95cc"} Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.923133 4749 scope.go:117] "RemoveContainer" containerID="4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.972917 4749 scope.go:117] "RemoveContainer" containerID="6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57" Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.976563 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:34:59 crc kubenswrapper[4749]: I1129 03:34:59.994036 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hcl82"] Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.006107 4749 scope.go:117] "RemoveContainer" containerID="2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75" Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.052684 4749 scope.go:117] "RemoveContainer" containerID="4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521" Nov 29 03:35:00 crc kubenswrapper[4749]: E1129 03:35:00.053005 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521\": container with ID starting with 4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521 not found: ID does not exist" containerID="4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521" Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.053033 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521"} err="failed to get container status \"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521\": rpc error: code = NotFound desc = could not find container \"4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521\": container with ID starting with 4e5235e5d9062b72e232570967c52148132b03fc08edd9bf603bf7bab357f521 not found: ID does not exist" Nov 29 
Nov 29 03:35:00 crc kubenswrapper[4749]: E1129 03:35:00.053387 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57\": container with ID starting with 6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57 not found: ID does not exist" containerID="6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57"
Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.053428 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57"} err="failed to get container status \"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57\": rpc error: code = NotFound desc = could not find container \"6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57\": container with ID starting with 6524a218e79e1fa9119e148a6ecf18710af184f98103c10126c48d23b7355c57 not found: ID does not exist"
Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.053456 4749 scope.go:117] "RemoveContainer" containerID="2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75"
Nov 29 03:35:00 crc kubenswrapper[4749]: E1129 03:35:00.053855 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75\": container with ID starting with 2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75 not found: ID does not exist" containerID="2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75"
Nov 29 03:35:00 crc kubenswrapper[4749]: I1129 03:35:00.053878 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75"} err="failed to get container status \"2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75\": rpc error: code = NotFound desc = could not find container \"2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75\": container with ID starting with 2174edcee9455eb08451417bebbd440caac8f397251fa2c1e75ea20e11523a75 not found: ID does not exist"
Nov 29 03:35:01 crc kubenswrapper[4749]: I1129 03:35:01.091869 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" path="/var/lib/kubelet/pods/d9cb41e6-5c41-472d-b5a7-e863850b8a36/volumes"
Nov 29 03:35:06 crc kubenswrapper[4749]: I1129 03:35:06.074710 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:35:06 crc kubenswrapper[4749]: E1129 03:35:06.075546 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:35:19 crc kubenswrapper[4749]: I1129 03:35:19.075397 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:35:19 crc kubenswrapper[4749]: E1129 03:35:19.076486 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:35:31 crc kubenswrapper[4749]: I1129 03:35:31.076784 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:35:31 crc kubenswrapper[4749]: E1129 03:35:31.078038 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:35:42 crc kubenswrapper[4749]: I1129 03:35:42.891474 4749 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-kpsn9 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 29 03:35:42 crc kubenswrapper[4749]: I1129 03:35:42.892137 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kpsn9" podUID="e3bc3d9f-ca91-4ab2-99f2-0558be9adf59" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 29 03:35:43 crc kubenswrapper[4749]: I1129 03:35:43.076380 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:35:43 crc kubenswrapper[4749]: E1129 03:35:43.076947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:35:55 crc kubenswrapper[4749]: I1129 03:35:55.076144 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:35:55 crc kubenswrapper[4749]: E1129 03:35:55.077376 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:36:10 crc kubenswrapper[4749]: I1129 03:36:10.075797 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:36:10 crc kubenswrapper[4749]: E1129 03:36:10.076837 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:36:23 crc kubenswrapper[4749]: I1129 03:36:23.076259 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:36:23 crc kubenswrapper[4749]: E1129 03:36:23.076965 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:36:36 crc kubenswrapper[4749]: I1129 03:36:36.078301 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:36:36 crc kubenswrapper[4749]: E1129 03:36:36.079805 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:36:50 crc kubenswrapper[4749]: I1129 03:36:50.077120 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:36:50 crc kubenswrapper[4749]: E1129 03:36:50.078543 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:37:04 crc kubenswrapper[4749]: I1129 03:37:04.075818 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:37:04 crc kubenswrapper[4749]: I1129 03:37:04.504085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539"}
Nov 29 03:37:27 crc kubenswrapper[4749]: I1129 03:37:27.062181 4749 scope.go:117] "RemoveContainer" containerID="5644c5084e0f00932e2a4812915720d5fa30298771faed009827248eb4d28168"
Nov 29 03:37:27 crc kubenswrapper[4749]: I1129 03:37:27.127812 4749 scope.go:117] "RemoveContainer" containerID="6eac411da97e1ef72e6d427cff8059c3c6b961abb9d0d74714ab0bcfd7865f6d"
Nov 29 03:37:27 crc kubenswrapper[4749]: I1129 03:37:27.178441 4749 scope.go:117] "RemoveContainer" containerID="289380718c40d22cd49cfde1a1fa1828a6882829b70207b9433f0771d46a2d92"
Nov 29 03:37:34 crc kubenswrapper[4749]: I1129 03:37:34.898909 4749 generic.go:334] "Generic (PLEG): container finished" podID="8e10fa10-d91b-497f-801e-2b6093ebdb8d" containerID="8efc02866887fe32a98d3619b22cdf1398ddbe8faba7d52b9fe64e23ade5c306" exitCode=0
Nov 29 03:37:34 crc kubenswrapper[4749]: I1129 03:37:34.899474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" event={"ID":"8e10fa10-d91b-497f-801e-2b6093ebdb8d","Type":"ContainerDied","Data":"8efc02866887fe32a98d3619b22cdf1398ddbe8faba7d52b9fe64e23ade5c306"}
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.457271 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6"
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567192 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567465 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbcvf\" (UniqueName: \"kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.567770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory\") pod \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\" (UID: \"8e10fa10-d91b-497f-801e-2b6093ebdb8d\") "
Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.573488 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.578665 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf" (OuterVolumeSpecName: "kube-api-access-vbcvf") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "kube-api-access-vbcvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.579700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph" (OuterVolumeSpecName: "ceph") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.600190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.618836 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory" (OuterVolumeSpecName: "inventory") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.623178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e10fa10-d91b-497f-801e-2b6093ebdb8d" (UID: "8e10fa10-d91b-497f-801e-2b6093ebdb8d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670857 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670891 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbcvf\" (UniqueName: \"kubernetes.io/projected/8e10fa10-d91b-497f-801e-2b6093ebdb8d-kube-api-access-vbcvf\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670904 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670915 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670925 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.670935 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e10fa10-d91b-497f-801e-2b6093ebdb8d-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.923662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" event={"ID":"8e10fa10-d91b-497f-801e-2b6093ebdb8d","Type":"ContainerDied","Data":"c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618"} Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.923703 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9cd25b0950eea51db379344854daf3edc8d8ec1c462eec9bdd9cb15c9ba4618" Nov 29 03:37:36 crc kubenswrapper[4749]: I1129 03:37:36.923755 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-gnwc6" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.046547 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt"] Nov 29 03:37:37 crc kubenswrapper[4749]: E1129 03:37:37.046960 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="registry-server" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.046978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="registry-server" Nov 29 03:37:37 crc kubenswrapper[4749]: E1129 03:37:37.046998 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e10fa10-d91b-497f-801e-2b6093ebdb8d" containerName="neutron-sriov-openstack-openstack-cell1" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047005 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e10fa10-d91b-497f-801e-2b6093ebdb8d" containerName="neutron-sriov-openstack-openstack-cell1" Nov 29 03:37:37 crc kubenswrapper[4749]: E1129 03:37:37.047018 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="extract-content" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047025 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="extract-content" Nov 29 03:37:37 crc kubenswrapper[4749]: E1129 03:37:37.047040 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="extract-utilities" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="extract-utilities" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047248 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cb41e6-5c41-472d-b5a7-e863850b8a36" containerName="registry-server" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047273 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e10fa10-d91b-497f-801e-2b6093ebdb8d" containerName="neutron-sriov-openstack-openstack-cell1" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.047975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.050083 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.050174 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.050368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.050465 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.052632 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.064399 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt"] Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078469 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7bc\" (UniqueName: \"kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078489 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.078515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180444 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7bc\" (UniqueName: \"kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.180788 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.185347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.185669 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" 
(UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.185746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.186419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.186851 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.202303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7bc\" (UniqueName: \"kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc\") pod \"neutron-dhcp-openstack-openstack-cell1-jl4rt\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.368673 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.958878 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:37:37 crc kubenswrapper[4749]: I1129 03:37:37.963470 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt"] Nov 29 03:37:38 crc kubenswrapper[4749]: I1129 03:37:38.946254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" event={"ID":"89a85fb9-4f52-4edb-a999-f5e373694943","Type":"ContainerStarted","Data":"3300f511428bceb4aead65648688393a922524e7c23fbcd365c908edc380f2ea"} Nov 29 03:37:38 crc kubenswrapper[4749]: I1129 03:37:38.946495 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" event={"ID":"89a85fb9-4f52-4edb-a999-f5e373694943","Type":"ContainerStarted","Data":"68e948cf3a0606439c4eaadf379f69515560a7c67b738040617d92c539b2de78"} Nov 29 03:37:38 crc kubenswrapper[4749]: I1129 03:37:38.969881 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" podStartSLOduration=1.482478549 podStartE2EDuration="1.969860605s" podCreationTimestamp="2025-11-29 03:37:37 +0000 UTC" firstStartedPulling="2025-11-29 03:37:37.958405889 +0000 UTC m=+8801.130555786" lastFinishedPulling="2025-11-29 03:37:38.445787975 +0000 UTC m=+8801.617937842" observedRunningTime="2025-11-29 03:37:38.962841324 +0000 UTC m=+8802.134991241" watchObservedRunningTime="2025-11-29 03:37:38.969860605 +0000 UTC m=+8802.142010472" Nov 29 03:39:11 crc kubenswrapper[4749]: I1129 03:39:11.919159 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"] Nov 29 03:39:11 crc kubenswrapper[4749]: I1129 03:39:11.923694 4749 util.go:30] "No sandbox for pod can be found. 
Nov 29 03:39:11 crc kubenswrapper[4749]: I1129 03:39:11.934901 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"]
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.049314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.049501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.049712 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsx7w\" (UniqueName: \"kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.152273 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsx7w\" (UniqueName: \"kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.154059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.154380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.155004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.155062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.190523 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsx7w\" (UniqueName: \"kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w\") pod \"certified-operators-fgmzn\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.259032 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fgmzn"
Nov 29 03:39:12 crc kubenswrapper[4749]: I1129 03:39:12.827879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"]
Nov 29 03:39:13 crc kubenswrapper[4749]: I1129 03:39:13.141165 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerID="d7038e31c1fa80e2b96e53c64b240314b35770699df56a4e921544099d5becbd" exitCode=0
Nov 29 03:39:13 crc kubenswrapper[4749]: I1129 03:39:13.141227 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerDied","Data":"d7038e31c1fa80e2b96e53c64b240314b35770699df56a4e921544099d5becbd"}
Nov 29 03:39:13 crc kubenswrapper[4749]: I1129 03:39:13.141477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerStarted","Data":"b86fce82cb3df167baed054f3c500183534fea163c34c765bd925865b79f2b33"}
Nov 29 03:39:14 crc kubenswrapper[4749]: I1129 03:39:14.160316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerStarted","Data":"cb7014033e9321cfee398cc2d67ba7aa8dea2b5b66b46079782ade323f696aa5"}
Nov 29 03:39:15 crc kubenswrapper[4749]: I1129 03:39:15.178126 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerID="cb7014033e9321cfee398cc2d67ba7aa8dea2b5b66b46079782ade323f696aa5" exitCode=0
Nov 29 03:39:15 crc kubenswrapper[4749]: I1129 03:39:15.178296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerDied","Data":"cb7014033e9321cfee398cc2d67ba7aa8dea2b5b66b46079782ade323f696aa5"}
Nov 29 03:39:16 crc kubenswrapper[4749]: I1129 03:39:16.199482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerStarted","Data":"399534f9fb3d312ff1a21933e21bbc3c8d014f3a1b2920afca152db2b62f3b69"}
Nov 29 03:39:16 crc kubenswrapper[4749]: I1129 03:39:16.235532 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fgmzn" podStartSLOduration=2.711294818 podStartE2EDuration="5.235509167s" podCreationTimestamp="2025-11-29 03:39:11 +0000 UTC" firstStartedPulling="2025-11-29 03:39:13.143325671 +0000 UTC m=+8896.315475528" lastFinishedPulling="2025-11-29 03:39:15.66754 +0000 UTC m=+8898.839689877" observedRunningTime="2025-11-29 03:39:16.226916458 +0000 UTC m=+8899.399066375" watchObservedRunningTime="2025-11-29 03:39:16.235509167 +0000 UTC m=+8899.407659064"
Nov 29 03:39:22 crc kubenswrapper[4749]: I1129 03:39:22.259765 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fgmzn"
status="" pod="openshift-marketplace/certified-operators-fgmzn" Nov 29 03:39:22 crc kubenswrapper[4749]: I1129 03:39:22.260773 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fgmzn" Nov 29 03:39:22 crc kubenswrapper[4749]: I1129 03:39:22.357795 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fgmzn" Nov 29 03:39:23 crc kubenswrapper[4749]: I1129 03:39:23.334328 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fgmzn" Nov 29 03:39:23 crc kubenswrapper[4749]: I1129 03:39:23.383280 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"] Nov 29 03:39:25 crc kubenswrapper[4749]: I1129 03:39:25.300066 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fgmzn" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="registry-server" containerID="cri-o://399534f9fb3d312ff1a21933e21bbc3c8d014f3a1b2920afca152db2b62f3b69" gracePeriod=2 Nov 29 03:39:25 crc kubenswrapper[4749]: I1129 03:39:25.375297 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:39:25 crc kubenswrapper[4749]: I1129 03:39:25.375628 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.316759 4749 generic.go:334] "Generic (PLEG): container finished" podID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerID="399534f9fb3d312ff1a21933e21bbc3c8d014f3a1b2920afca152db2b62f3b69" exitCode=0 Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.316824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerDied","Data":"399534f9fb3d312ff1a21933e21bbc3c8d014f3a1b2920afca152db2b62f3b69"} Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.317177 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fgmzn" event={"ID":"ed6f93ca-1d66-4455-a367-5ca4a5d1442f","Type":"ContainerDied","Data":"b86fce82cb3df167baed054f3c500183534fea163c34c765bd925865b79f2b33"} Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.317221 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b86fce82cb3df167baed054f3c500183534fea163c34c765bd925865b79f2b33" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.341490 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fgmzn" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.540482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content\") pod \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.540733 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsx7w\" (UniqueName: \"kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w\") pod \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.540816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities\") pod \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\" (UID: \"ed6f93ca-1d66-4455-a367-5ca4a5d1442f\") " Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.542818 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities" (OuterVolumeSpecName: "utilities") pod "ed6f93ca-1d66-4455-a367-5ca4a5d1442f" (UID: "ed6f93ca-1d66-4455-a367-5ca4a5d1442f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.558862 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w" (OuterVolumeSpecName: "kube-api-access-qsx7w") pod "ed6f93ca-1d66-4455-a367-5ca4a5d1442f" (UID: "ed6f93ca-1d66-4455-a367-5ca4a5d1442f"). InnerVolumeSpecName "kube-api-access-qsx7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.616156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed6f93ca-1d66-4455-a367-5ca4a5d1442f" (UID: "ed6f93ca-1d66-4455-a367-5ca4a5d1442f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.644472 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsx7w\" (UniqueName: \"kubernetes.io/projected/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-kube-api-access-qsx7w\") on node \"crc\" DevicePath \"\"" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.644566 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:39:26 crc kubenswrapper[4749]: I1129 03:39:26.644587 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed6f93ca-1d66-4455-a367-5ca4a5d1442f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:39:27 crc kubenswrapper[4749]: I1129 03:39:27.329616 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 03:39:27 crc kubenswrapper[4749]: I1129 03:39:27.377232 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"]
Nov 29 03:39:27 crc kubenswrapper[4749]: I1129 03:39:27.392222 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fgmzn"]
Nov 29 03:39:29 crc kubenswrapper[4749]: I1129 03:39:29.101279 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" path="/var/lib/kubelet/pods/ed6f93ca-1d66-4455-a367-5ca4a5d1442f/volumes"
Nov 29 03:39:55 crc kubenswrapper[4749]: I1129 03:39:55.374964 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 03:39:55 crc kubenswrapper[4749]: I1129 03:39:55.375614 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 03:40:25 crc kubenswrapper[4749]: I1129 03:40:25.374746 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 03:40:25 crc kubenswrapper[4749]: I1129 03:40:25.375587 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 03:40:25 crc kubenswrapper[4749]: I1129 03:40:25.375662 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
Nov 29 03:40:25 crc kubenswrapper[4749]: I1129 03:40:25.376863 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 03:40:25 crc kubenswrapper[4749]: I1129 03:40:25.376964 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539" gracePeriod=600
Nov 29 03:40:26 crc kubenswrapper[4749]: I1129 03:40:26.066872 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539" exitCode=0
Nov 29 03:40:26 crc kubenswrapper[4749]: I1129 03:40:26.066960 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539"}
Nov 29 03:40:26 crc kubenswrapper[4749]: I1129 03:40:26.067645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"}
Nov 29 03:40:26 crc kubenswrapper[4749]: I1129 03:40:26.067664 4749 scope.go:117] "RemoveContainer" containerID="c0b67d2e806fa6b3f142a9c4d8eccb897b60958a95b4ce845259f83c008806b6"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.283214 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:40:59 crc kubenswrapper[4749]: E1129 03:40:59.296121 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="registry-server"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.296175 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="registry-server"
Nov 29 03:40:59 crc kubenswrapper[4749]: E1129 03:40:59.296258 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="extract-utilities"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.296270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="extract-utilities"
Nov 29 03:40:59 crc kubenswrapper[4749]: E1129 03:40:59.296319 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="extract-content"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.296326 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="extract-content"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.301956 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed6f93ca-1d66-4455-a367-5ca4a5d1442f" containerName="registry-server"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.303828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.303949 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.336345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.336729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.336874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdzv\" (UniqueName: \"kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.439623 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdzv\" (UniqueName: \"kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.439774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.439910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.440480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.440491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.462912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdzv\" (UniqueName: \"kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv\") pod \"redhat-marketplace-8cchf\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") " pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:40:59 crc kubenswrapper[4749]: I1129 03:40:59.630029 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:00 crc kubenswrapper[4749]: I1129 03:41:00.222407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:41:00 crc kubenswrapper[4749]: I1129 03:41:00.550573 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerID="f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b" exitCode=0
Nov 29 03:41:00 crc kubenswrapper[4749]: I1129 03:41:00.550861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerDied","Data":"f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b"}
Nov 29 03:41:00 crc kubenswrapper[4749]: I1129 03:41:00.551057 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerStarted","Data":"137111a161a59ce1f9b037113c2e7eba1aa10e51813067d76402a22c12ba160a"}
Nov 29 03:41:03 crc kubenswrapper[4749]: I1129 03:41:03.587652 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerID="f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8" exitCode=0
Nov 29 03:41:03 crc kubenswrapper[4749]: I1129 03:41:03.587824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerDied","Data":"f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8"}
Nov 29 03:41:04 crc kubenswrapper[4749]: I1129 03:41:04.601849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerStarted","Data":"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"}
Nov 29 03:41:04 crc kubenswrapper[4749]: I1129 03:41:04.629301 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8cchf" podStartSLOduration=2.100899097 podStartE2EDuration="5.629281295s" podCreationTimestamp="2025-11-29 03:40:59 +0000 UTC" firstStartedPulling="2025-11-29 03:41:00.553292335 +0000 UTC m=+9003.725442192" lastFinishedPulling="2025-11-29 03:41:04.081674533 +0000 UTC m=+9007.253824390" observedRunningTime="2025-11-29 03:41:04.623451623 +0000 UTC m=+9007.795601530" watchObservedRunningTime="2025-11-29 03:41:04.629281295 +0000 UTC m=+9007.801431162"
Nov 29 03:41:09 crc kubenswrapper[4749]: I1129 03:41:09.631156 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:09 crc kubenswrapper[4749]: I1129 03:41:09.632144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:09 crc kubenswrapper[4749]: I1129 03:41:09.689221 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:09 crc kubenswrapper[4749]: I1129 03:41:09.769106 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:09 crc kubenswrapper[4749]: I1129 03:41:09.940769 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:41:11 crc kubenswrapper[4749]: I1129 03:41:11.737424 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8cchf" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="registry-server" containerID="cri-o://ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395" gracePeriod=2
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.282783 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.359705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities\") pod \"0b3fcdcd-b880-4746-81c2-40dc83e32250\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") "
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.359780 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content\") pod \"0b3fcdcd-b880-4746-81c2-40dc83e32250\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") "
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.359969 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdzv\" (UniqueName: \"kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv\") pod \"0b3fcdcd-b880-4746-81c2-40dc83e32250\" (UID: \"0b3fcdcd-b880-4746-81c2-40dc83e32250\") "
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.360709 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities" (OuterVolumeSpecName: "utilities") pod "0b3fcdcd-b880-4746-81c2-40dc83e32250" (UID: "0b3fcdcd-b880-4746-81c2-40dc83e32250"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.361245 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-utilities\") on node \"crc\" DevicePath \"\""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.368546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv" (OuterVolumeSpecName: "kube-api-access-xwdzv") pod "0b3fcdcd-b880-4746-81c2-40dc83e32250" (UID: "0b3fcdcd-b880-4746-81c2-40dc83e32250"). InnerVolumeSpecName "kube-api-access-xwdzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.389497 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b3fcdcd-b880-4746-81c2-40dc83e32250" (UID: "0b3fcdcd-b880-4746-81c2-40dc83e32250"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.464004 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdzv\" (UniqueName: \"kubernetes.io/projected/0b3fcdcd-b880-4746-81c2-40dc83e32250-kube-api-access-xwdzv\") on node \"crc\" DevicePath \"\""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.464058 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b3fcdcd-b880-4746-81c2-40dc83e32250-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.751594 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerID="ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395" exitCode=0
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.751680 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cchf"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.751673 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerDied","Data":"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"}
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.753069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cchf" event={"ID":"0b3fcdcd-b880-4746-81c2-40dc83e32250","Type":"ContainerDied","Data":"137111a161a59ce1f9b037113c2e7eba1aa10e51813067d76402a22c12ba160a"}
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.753112 4749 scope.go:117] "RemoveContainer" containerID="ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.795845 4749 scope.go:117] "RemoveContainer" containerID="f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.828487 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.842642 4749 scope.go:117] "RemoveContainer" containerID="f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.856957 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cchf"]
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.898709 4749 scope.go:117] "RemoveContainer" containerID="ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"
Nov 29 03:41:12 crc kubenswrapper[4749]: E1129 03:41:12.899312 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395\": container with ID starting with ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395 not found: ID does not exist" containerID="ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"
Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.899379 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395"} err="failed to get container status \"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395\": rpc error: code = NotFound desc = could not find container \"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395\": container with ID starting with ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395 not found: ID does not exist"
\"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395\": rpc error: code = NotFound desc = could not find container \"ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395\": container with ID starting with ccfe4c698396d9c52fe073dfa1540ed60a43789dfb8ab822c0febd92847c9395 not found: ID does not exist" Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.899412 4749 scope.go:117] "RemoveContainer" containerID="f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8" Nov 29 03:41:12 crc kubenswrapper[4749]: E1129 03:41:12.899737 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8\": container with ID starting with f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8 not found: ID does not exist" containerID="f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8" Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.899758 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8"} err="failed to get container status \"f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8\": rpc error: code = NotFound desc = could not find container \"f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8\": container with ID starting with f394b9f93b67bda792083905271a560f30d1f193f34e49a668e8ae16e5be41b8 not found: ID does not exist" Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.899771 4749 scope.go:117] "RemoveContainer" containerID="f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b" Nov 29 03:41:12 crc kubenswrapper[4749]: E1129 03:41:12.900257 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b\": container with ID starting with f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b not found: ID does not exist" containerID="f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b" Nov 29 03:41:12 crc kubenswrapper[4749]: I1129 03:41:12.900299 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b"} err="failed to get container status \"f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b\": rpc error: code = NotFound desc = could not find container \"f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b\": container with ID starting with f87fbf1800bce95d5dae3a1aa55658c071803bfb2c0fd8d45e764739a0e7e43b not found: ID does not exist" Nov 29 03:41:13 crc kubenswrapper[4749]: I1129 03:41:13.090935 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" path="/var/lib/kubelet/pods/0b3fcdcd-b880-4746-81c2-40dc83e32250/volumes" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.331497 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:05 crc kubenswrapper[4749]: E1129 03:42:05.332785 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="extract-utilities" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.332804 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="extract-utilities" Nov 29 03:42:05 crc kubenswrapper[4749]: E1129 03:42:05.332832 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="registry-server" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.332842 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="registry-server" Nov 29 03:42:05 crc kubenswrapper[4749]: E1129 03:42:05.332855 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="extract-content" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.332862 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="extract-content" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.333175 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b3fcdcd-b880-4746-81c2-40dc83e32250" containerName="registry-server" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.335080 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.349459 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.400781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.400831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72fwp\" (UniqueName: \"kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.400873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.502964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.503009 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72fwp\" (UniqueName: \"kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.503063 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.503600 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.503684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.521512 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72fwp\" (UniqueName: \"kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp\") pod \"redhat-operators-k4gk9\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:05 crc kubenswrapper[4749]: I1129 03:42:05.666133 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:06 crc kubenswrapper[4749]: I1129 03:42:06.212613 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:06 crc kubenswrapper[4749]: I1129 03:42:06.468653 4749 generic.go:334] "Generic (PLEG): container finished" podID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerID="c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284" exitCode=0 Nov 29 03:42:06 crc kubenswrapper[4749]: I1129 03:42:06.468879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerDied","Data":"c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284"} Nov 29 03:42:06 crc kubenswrapper[4749]: I1129 03:42:06.468903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerStarted","Data":"f8bbbf760b5e23df686ad77b22efbd4953432cea7f8ab7744be9cce7f2539bda"} Nov 29 03:42:06 crc kubenswrapper[4749]: E1129 03:42:06.634987 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44cbfd4_29d2_4bf5_8744_2ca34aece867.slice/crio-c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44cbfd4_29d2_4bf5_8744_2ca34aece867.slice/crio-conmon-c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284.scope\": RecentStats: unable to find data in memory cache]" Nov 29 03:42:07 crc kubenswrapper[4749]: I1129 03:42:07.488582 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" 
event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerStarted","Data":"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c"} Nov 29 03:42:10 crc kubenswrapper[4749]: I1129 03:42:10.560384 4749 generic.go:334] "Generic (PLEG): container finished" podID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerID="1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c" exitCode=0 Nov 29 03:42:10 crc kubenswrapper[4749]: I1129 03:42:10.560608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerDied","Data":"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c"} Nov 29 03:42:11 crc kubenswrapper[4749]: I1129 03:42:11.578610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerStarted","Data":"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac"} Nov 29 03:42:11 crc kubenswrapper[4749]: I1129 03:42:11.600736 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k4gk9" podStartSLOduration=2.066880498 podStartE2EDuration="6.600712856s" podCreationTimestamp="2025-11-29 03:42:05 +0000 UTC" firstStartedPulling="2025-11-29 03:42:06.470432667 +0000 UTC m=+9069.642582514" lastFinishedPulling="2025-11-29 03:42:11.004265005 +0000 UTC m=+9074.176414872" observedRunningTime="2025-11-29 03:42:11.599464655 +0000 UTC m=+9074.771614542" watchObservedRunningTime="2025-11-29 03:42:11.600712856 +0000 UTC m=+9074.772862723" Nov 29 03:42:13 crc kubenswrapper[4749]: I1129 03:42:13.604769 4749 generic.go:334] "Generic (PLEG): container finished" podID="89a85fb9-4f52-4edb-a999-f5e373694943" containerID="3300f511428bceb4aead65648688393a922524e7c23fbcd365c908edc380f2ea" exitCode=0 Nov 29 03:42:13 crc kubenswrapper[4749]: I1129 03:42:13.604958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" event={"ID":"89a85fb9-4f52-4edb-a999-f5e373694943","Type":"ContainerDied","Data":"3300f511428bceb4aead65648688393a922524e7c23fbcd365c908edc380f2ea"} Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.102531 4749 util.go:48] "No ready sandbox for pod can be found. 
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126439 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126635 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.126714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7bc\" (UniqueName: \"kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc\") pod \"89a85fb9-4f52-4edb-a999-f5e373694943\" (UID: \"89a85fb9-4f52-4edb-a999-f5e373694943\") "
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.134146 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.134788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc" (OuterVolumeSpecName: "kube-api-access-2j7bc") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "kube-api-access-2j7bc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.148959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph" (OuterVolumeSpecName: "ceph") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.191447 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory" (OuterVolumeSpecName: "inventory") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.212333 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.219821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "89a85fb9-4f52-4edb-a999-f5e373694943" (UID: "89a85fb9-4f52-4edb-a999-f5e373694943"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228725 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228756 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228768 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-inventory\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228776 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228785 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/89a85fb9-4f52-4edb-a999-f5e373694943-ceph\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.228794 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7bc\" (UniqueName: \"kubernetes.io/projected/89a85fb9-4f52-4edb-a999-f5e373694943-kube-api-access-2j7bc\") on node \"crc\" DevicePath \"\""
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.630718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" event={"ID":"89a85fb9-4f52-4edb-a999-f5e373694943","Type":"ContainerDied","Data":"68e948cf3a0606439c4eaadf379f69515560a7c67b738040617d92c539b2de78"}
Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.630760 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e948cf3a0606439c4eaadf379f69515560a7c67b738040617d92c539b2de78"
containerID="68e948cf3a0606439c4eaadf379f69515560a7c67b738040617d92c539b2de78" Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.631092 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-jl4rt" Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.666464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:15 crc kubenswrapper[4749]: I1129 03:42:15.670371 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:16 crc kubenswrapper[4749]: I1129 03:42:16.733347 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k4gk9" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="registry-server" probeResult="failure" output=< Nov 29 03:42:16 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 03:42:16 crc kubenswrapper[4749]: > Nov 29 03:42:25 crc kubenswrapper[4749]: I1129 03:42:25.374322 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:42:25 crc kubenswrapper[4749]: I1129 03:42:25.374910 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:42:25 crc kubenswrapper[4749]: I1129 03:42:25.714427 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:25 crc kubenswrapper[4749]: I1129 03:42:25.785059 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:25 crc kubenswrapper[4749]: I1129 03:42:25.966231 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:26 crc kubenswrapper[4749]: I1129 03:42:26.781772 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k4gk9" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="registry-server" containerID="cri-o://b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac" gracePeriod=2 Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.367070 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.463082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content\") pod \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.463798 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities\") pod \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.463917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72fwp\" (UniqueName: \"kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp\") pod \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\" (UID: \"f44cbfd4-29d2-4bf5-8744-2ca34aece867\") " Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.464650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities" (OuterVolumeSpecName: "utilities") pod "f44cbfd4-29d2-4bf5-8744-2ca34aece867" (UID: "f44cbfd4-29d2-4bf5-8744-2ca34aece867"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.470010 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp" (OuterVolumeSpecName: "kube-api-access-72fwp") pod "f44cbfd4-29d2-4bf5-8744-2ca34aece867" (UID: "f44cbfd4-29d2-4bf5-8744-2ca34aece867"). InnerVolumeSpecName "kube-api-access-72fwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.567524 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.567568 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72fwp\" (UniqueName: \"kubernetes.io/projected/f44cbfd4-29d2-4bf5-8744-2ca34aece867-kube-api-access-72fwp\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.600161 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f44cbfd4-29d2-4bf5-8744-2ca34aece867" (UID: "f44cbfd4-29d2-4bf5-8744-2ca34aece867"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.670923 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f44cbfd4-29d2-4bf5-8744-2ca34aece867-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.797112 4749 generic.go:334] "Generic (PLEG): container finished" podID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerID="b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac" exitCode=0 Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.797159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerDied","Data":"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac"} Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.797191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k4gk9" event={"ID":"f44cbfd4-29d2-4bf5-8744-2ca34aece867","Type":"ContainerDied","Data":"f8bbbf760b5e23df686ad77b22efbd4953432cea7f8ab7744be9cce7f2539bda"} Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.797243 4749 scope.go:117] "RemoveContainer" containerID="b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.797315 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k4gk9" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.843010 4749 scope.go:117] "RemoveContainer" containerID="1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.886740 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.893042 4749 scope.go:117] "RemoveContainer" containerID="c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.898073 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k4gk9"] Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.952238 4749 scope.go:117] "RemoveContainer" containerID="b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac" Nov 29 03:42:27 crc kubenswrapper[4749]: E1129 03:42:27.952677 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac\": container with ID starting with b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac not found: ID does not exist" containerID="b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.952731 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac"} err="failed to get container status \"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac\": rpc error: code = NotFound desc = could not find container \"b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac\": container with ID starting with b32228aa19612569bd5c8297c30d33786826f570c94a38131e9cd3a1725f26ac not found: ID does not exist" Nov 29 03:42:27 crc 
kubenswrapper[4749]: I1129 03:42:27.952769 4749 scope.go:117] "RemoveContainer" containerID="1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c" Nov 29 03:42:27 crc kubenswrapper[4749]: E1129 03:42:27.953051 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c\": container with ID starting with 1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c not found: ID does not exist" containerID="1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.953088 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c"} err="failed to get container status \"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c\": rpc error: code = NotFound desc = could not find container \"1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c\": container with ID starting with 1cd463b2e7c0538e5319ace473739dedad861f369185ba42228cebf9acef034c not found: ID does not exist" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.953112 4749 scope.go:117] "RemoveContainer" containerID="c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284" Nov 29 03:42:27 crc kubenswrapper[4749]: E1129 03:42:27.953385 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284\": container with ID starting with c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284 not found: ID does not exist" containerID="c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284" Nov 29 03:42:27 crc kubenswrapper[4749]: I1129 03:42:27.953425 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284"} err="failed to get container status \"c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284\": rpc error: code = NotFound desc = could not find container \"c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284\": container with ID starting with c04a76d12f8c0b95bd118bbd320190847aea2410a7cbeee7d86ece33c0470284 not found: ID does not exist" Nov 29 03:42:29 crc kubenswrapper[4749]: I1129 03:42:29.093635 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" path="/var/lib/kubelet/pods/f44cbfd4-29d2-4bf5-8744-2ca34aece867/volumes" Nov 29 03:42:44 crc kubenswrapper[4749]: I1129 03:42:44.173193 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:44 crc kubenswrapper[4749]: I1129 03:42:44.173980 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerName="nova-cell0-conductor-conductor" containerID="cri-o://378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" gracePeriod=30 Nov 29 03:42:44 crc kubenswrapper[4749]: I1129 03:42:44.211510 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:44 crc kubenswrapper[4749]: I1129 03:42:44.212190 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-conductor-0" podUID="78f71145-5cfd-4d13-a7c5-37301910d02b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.068957 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.069176 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-log" containerID="cri-o://e6bbd4ce061db29771862d9d9e2d044ae3b6847acc9c278038db93b75d7c1598" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.069401 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-api" containerID="cri-o://e254aab0196ba3756f1af170fc7a3657ba70c5bc2ce0d97642e1beca12843cc9" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.139968 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.140215 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerName="nova-scheduler-scheduler" containerID="cri-o://463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.152025 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.152277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" containerID="cri-o://cf7d7fc251dc251e8dadd04344a6d436cee05a333ba8ea5cb11dd5df1561aed9" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.152410 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" containerID="cri-o://844b63d7ef81b44bf6d99aac2f89407a540f9f2ec1504d239bcc104aa536427d" gracePeriod=30 Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.860088 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.948573 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data\") pod \"78f71145-5cfd-4d13-a7c5-37301910d02b\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.948744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pxm\" (UniqueName: \"kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm\") pod \"78f71145-5cfd-4d13-a7c5-37301910d02b\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.948860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle\") pod \"78f71145-5cfd-4d13-a7c5-37301910d02b\" (UID: \"78f71145-5cfd-4d13-a7c5-37301910d02b\") " Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.953687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm" (OuterVolumeSpecName: "kube-api-access-l7pxm") pod "78f71145-5cfd-4d13-a7c5-37301910d02b" (UID: "78f71145-5cfd-4d13-a7c5-37301910d02b"). InnerVolumeSpecName "kube-api-access-l7pxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.976351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f71145-5cfd-4d13-a7c5-37301910d02b" (UID: "78f71145-5cfd-4d13-a7c5-37301910d02b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:45 crc kubenswrapper[4749]: I1129 03:42:45.976940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data" (OuterVolumeSpecName: "config-data") pod "78f71145-5cfd-4d13-a7c5-37301910d02b" (UID: "78f71145-5cfd-4d13-a7c5-37301910d02b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.039232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerDied","Data":"cf7d7fc251dc251e8dadd04344a6d436cee05a333ba8ea5cb11dd5df1561aed9"} Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.039176 4749 generic.go:334] "Generic (PLEG): container finished" podID="456ec409-a59d-4080-ac18-96207a4138a6" containerID="cf7d7fc251dc251e8dadd04344a6d436cee05a333ba8ea5cb11dd5df1561aed9" exitCode=143 Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.041646 4749 generic.go:334] "Generic (PLEG): container finished" podID="78f71145-5cfd-4d13-a7c5-37301910d02b" containerID="373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc" exitCode=0 Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.041696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78f71145-5cfd-4d13-a7c5-37301910d02b","Type":"ContainerDied","Data":"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc"} Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.041716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"78f71145-5cfd-4d13-a7c5-37301910d02b","Type":"ContainerDied","Data":"635158f8754c8ec2e43d7775d34b28b99446a46593795f03bfb5c84da023b568"} Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.041733 4749 scope.go:117] "RemoveContainer" containerID="373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.041744 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.044836 4749 generic.go:334] "Generic (PLEG): container finished" podID="06c8327c-c860-4340-87b5-bc2a939d986a" containerID="e6bbd4ce061db29771862d9d9e2d044ae3b6847acc9c278038db93b75d7c1598" exitCode=143 Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.044859 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerDied","Data":"e6bbd4ce061db29771862d9d9e2d044ae3b6847acc9c278038db93b75d7c1598"} Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.051639 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.051669 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7pxm\" (UniqueName: \"kubernetes.io/projected/78f71145-5cfd-4d13-a7c5-37301910d02b-kube-api-access-l7pxm\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.051685 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f71145-5cfd-4d13-a7c5-37301910d02b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.066433 4749 scope.go:117] "RemoveContainer" containerID="373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.067031 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc\": container with ID starting with 373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc not found: ID does not exist" containerID="373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.067069 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc"} err="failed to get container status \"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc\": rpc error: code = NotFound desc = could not find container \"373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc\": container with ID starting with 373719c5fcd73bbe5178b2be9cfbe245f30768107c37aa2cc6cd379dded3cabc not found: ID does not exist" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.070252 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.071912 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.073478 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.073523 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerName="nova-scheduler-scheduler" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.103812 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.116803 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.127823 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.128335 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f71145-5cfd-4d13-a7c5-37301910d02b" containerName="nova-cell1-conductor-conductor" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128352 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f71145-5cfd-4d13-a7c5-37301910d02b" containerName="nova-cell1-conductor-conductor" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.128377 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="extract-utilities" 
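The `ExecSync cmd from runtime service failed` entries above come from the readiness probe of nova-scheduler-0: an exec probe that runs `/usr/bin/pgrep -r DRST nova-scheduler` inside the container. Once CRI-O refuses to register a new exec PID in a stopping container, the kubelet reports `Probe errored` rather than a clean probe failure. A minimal sketch of the pass/fail contract such an exec probe reduces to, assuming only that the command's exit status drives the result (an illustration, not kubelet source):

```go
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// The probe command exactly as logged:
	//   /usr/bin/pgrep -r DRST nova-scheduler
	// pgrep exits 0 when at least one matching process exists (probe
	// success) and non-zero otherwise (probe failure). While the
	// container is stopping, CRI-O cannot register a new exec PID at
	// all, which is why the log shows "Probe errored" instead of a
	// normal "Probe failed".
	cmd := exec.Command("/usr/bin/pgrep", "-r", "DRST", "nova-scheduler")
	if err := cmd.Run(); err != nil {
		fmt.Println("readiness: failure:", err)
		return
	}
	fmt.Println("readiness: success")
}
```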
Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128384 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="extract-utilities" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.128408 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="extract-content" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128414 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="extract-content" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.128444 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="registry-server" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128453 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="registry-server" Nov 29 03:42:46 crc kubenswrapper[4749]: E1129 03:42:46.128487 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a85fb9-4f52-4edb-a999-f5e373694943" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128493 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a85fb9-4f52-4edb-a999-f5e373694943" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128701 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f71145-5cfd-4d13-a7c5-37301910d02b" containerName="nova-cell1-conductor-conductor" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128730 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a85fb9-4f52-4edb-a999-f5e373694943" containerName="neutron-dhcp-openstack-openstack-cell1" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.128752 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44cbfd4-29d2-4bf5-8744-2ca34aece867" containerName="registry-server" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.129798 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.131607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.135915 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.153465 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmxzf\" (UniqueName: \"kubernetes.io/projected/9987255c-ae1e-412a-b2c0-f0043906ccd3-kube-api-access-fmxzf\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.153625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.153815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.255156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.255271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmxzf\" (UniqueName: \"kubernetes.io/projected/9987255c-ae1e-412a-b2c0-f0043906ccd3-kube-api-access-fmxzf\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.255363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.259793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.260779 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9987255c-ae1e-412a-b2c0-f0043906ccd3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.276103 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmxzf\" (UniqueName: \"kubernetes.io/projected/9987255c-ae1e-412a-b2c0-f0043906ccd3-kube-api-access-fmxzf\") pod \"nova-cell1-conductor-0\" (UID: \"9987255c-ae1e-412a-b2c0-f0043906ccd3\") " pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.471630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:46 crc kubenswrapper[4749]: I1129 03:42:46.983409 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 29 03:42:47 crc kubenswrapper[4749]: I1129 03:42:47.061917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9987255c-ae1e-412a-b2c0-f0043906ccd3","Type":"ContainerStarted","Data":"a53a6eba0de9e7c9d5255380f635763008a392a1f6614e0897f3c266cc24bdba"} Nov 29 03:42:47 crc kubenswrapper[4749]: I1129 03:42:47.096109 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f71145-5cfd-4d13-a7c5-37301910d02b" path="/var/lib/kubelet/pods/78f71145-5cfd-4d13-a7c5-37301910d02b/volumes" Nov 29 03:42:48 crc kubenswrapper[4749]: I1129 03:42:48.086686 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9987255c-ae1e-412a-b2c0-f0043906ccd3","Type":"ContainerStarted","Data":"8ef0a5d45bbb3d44c27ac4701056eabbfe538b109cc9c6d482e0318205530dfd"} Nov 29 03:42:48 crc kubenswrapper[4749]: I1129 03:42:48.087295 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:48 crc kubenswrapper[4749]: I1129 03:42:48.114255 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.114236901 podStartE2EDuration="2.114236901s" podCreationTimestamp="2025-11-29 03:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:42:48.1063615 +0000 UTC m=+9111.278511367" watchObservedRunningTime="2025-11-29 03:42:48.114236901 +0000 UTC m=+9111.286386758" Nov 29 03:42:48 crc kubenswrapper[4749]: E1129 03:42:48.280732 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 03:42:48 crc kubenswrapper[4749]: E1129 03:42:48.284833 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 03:42:48 crc kubenswrapper[4749]: E1129 03:42:48.287938 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 29 03:42:48 crc kubenswrapper[4749]: E1129 03:42:48.288008 4749 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerName="nova-cell0-conductor-conductor" Nov 29 03:42:48 crc kubenswrapper[4749]: I1129 03:42:48.728899 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:50044->10.217.1.83:8775: read: connection reset by peer" Nov 29 03:42:48 crc kubenswrapper[4749]: I1129 03:42:48.728948 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:50056->10.217.1.83:8775: read: connection reset by peer" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.108123 4749 generic.go:334] "Generic (PLEG): container finished" podID="06c8327c-c860-4340-87b5-bc2a939d986a" containerID="e254aab0196ba3756f1af170fc7a3657ba70c5bc2ce0d97642e1beca12843cc9" exitCode=0 Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.108277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerDied","Data":"e254aab0196ba3756f1af170fc7a3657ba70c5bc2ce0d97642e1beca12843cc9"} Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.120701 4749 generic.go:334] "Generic (PLEG): container finished" podID="456ec409-a59d-4080-ac18-96207a4138a6" containerID="844b63d7ef81b44bf6d99aac2f89407a540f9f2ec1504d239bcc104aa536427d" exitCode=0 Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.120773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerDied","Data":"844b63d7ef81b44bf6d99aac2f89407a540f9f2ec1504d239bcc104aa536427d"} Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.494189 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg"] Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.495967 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.499216 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.501113 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.501533 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.501634 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ghmtb" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.501879 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.504418 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.504570 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.516397 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg"] Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.642910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.642997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643219 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643419 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643468 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqsk\" (UniqueName: \"kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.643912 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.644112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.644182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.745987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.746005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 
03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.746056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.746075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqsk\" (UniqueName: \"kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.746104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.747253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.748165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.800885 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.802706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.802907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.803153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.803599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.803806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.804791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.805296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqsk\" (UniqueName: \"kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.813585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.922179 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.929521 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:42:49 crc kubenswrapper[4749]: I1129 03:42:49.937940 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.052988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs\") pod \"06c8327c-c860-4340-87b5-bc2a939d986a\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqjws\" (UniqueName: \"kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws\") pod \"456ec409-a59d-4080-ac18-96207a4138a6\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053226 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle\") pod \"06c8327c-c860-4340-87b5-bc2a939d986a\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data\") pod \"06c8327c-c860-4340-87b5-bc2a939d986a\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle\") pod \"456ec409-a59d-4080-ac18-96207a4138a6\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx9hl\" (UniqueName: \"kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl\") pod \"06c8327c-c860-4340-87b5-bc2a939d986a\" (UID: \"06c8327c-c860-4340-87b5-bc2a939d986a\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data\") pod \"456ec409-a59d-4080-ac18-96207a4138a6\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs\") pod \"456ec409-a59d-4080-ac18-96207a4138a6\" (UID: \"456ec409-a59d-4080-ac18-96207a4138a6\") " Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.053882 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs" (OuterVolumeSpecName: "logs") pod "06c8327c-c860-4340-87b5-bc2a939d986a" (UID: "06c8327c-c860-4340-87b5-bc2a939d986a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.055168 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06c8327c-c860-4340-87b5-bc2a939d986a-logs\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.073581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs" (OuterVolumeSpecName: "logs") pod "456ec409-a59d-4080-ac18-96207a4138a6" (UID: "456ec409-a59d-4080-ac18-96207a4138a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.097918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws" (OuterVolumeSpecName: "kube-api-access-lqjws") pod "456ec409-a59d-4080-ac18-96207a4138a6" (UID: "456ec409-a59d-4080-ac18-96207a4138a6"). InnerVolumeSpecName "kube-api-access-lqjws". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.117683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl" (OuterVolumeSpecName: "kube-api-access-hx9hl") pod "06c8327c-c860-4340-87b5-bc2a939d986a" (UID: "06c8327c-c860-4340-87b5-bc2a939d986a"). InnerVolumeSpecName "kube-api-access-hx9hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.142399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06c8327c-c860-4340-87b5-bc2a939d986a" (UID: "06c8327c-c860-4340-87b5-bc2a939d986a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.159489 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/456ec409-a59d-4080-ac18-96207a4138a6-logs\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.159519 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqjws\" (UniqueName: \"kubernetes.io/projected/456ec409-a59d-4080-ac18-96207a4138a6-kube-api-access-lqjws\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.159533 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.159542 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx9hl\" (UniqueName: \"kubernetes.io/projected/06c8327c-c860-4340-87b5-bc2a939d986a-kube-api-access-hx9hl\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.163298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "456ec409-a59d-4080-ac18-96207a4138a6" (UID: "456ec409-a59d-4080-ac18-96207a4138a6"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.166761 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"06c8327c-c860-4340-87b5-bc2a939d986a","Type":"ContainerDied","Data":"a931130275e40f02afd9bbd08d46fd1060a3a4d349b19d7f6bd6e07f99df4612"} Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.166812 4749 scope.go:117] "RemoveContainer" containerID="e254aab0196ba3756f1af170fc7a3657ba70c5bc2ce0d97642e1beca12843cc9" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.167125 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.171667 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data" (OuterVolumeSpecName: "config-data") pod "06c8327c-c860-4340-87b5-bc2a939d986a" (UID: "06c8327c-c860-4340-87b5-bc2a939d986a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.180714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"456ec409-a59d-4080-ac18-96207a4138a6","Type":"ContainerDied","Data":"55ab4cbb11dc82de3a38da352b98dfc4bafa1c3eb092dccc874476d811c00176"} Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.180852 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.214017 4749 scope.go:117] "RemoveContainer" containerID="e6bbd4ce061db29771862d9d9e2d044ae3b6847acc9c278038db93b75d7c1598" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.217337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data" (OuterVolumeSpecName: "config-data") pod "456ec409-a59d-4080-ac18-96207a4138a6" (UID: "456ec409-a59d-4080-ac18-96207a4138a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.280333 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06c8327c-c860-4340-87b5-bc2a939d986a-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.280374 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.280389 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/456ec409-a59d-4080-ac18-96207a4138a6-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.309533 4749 scope.go:117] "RemoveContainer" containerID="844b63d7ef81b44bf6d99aac2f89407a540f9f2ec1504d239bcc104aa536427d" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.332718 4749 scope.go:117] "RemoveContainer" containerID="cf7d7fc251dc251e8dadd04344a6d436cee05a333ba8ea5cb11dd5df1561aed9" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.521258 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.530639 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.544888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.566354 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: E1129 03:42:50.566875 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-api" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.566892 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-api" Nov 29 03:42:50 crc kubenswrapper[4749]: E1129 03:42:50.566912 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.566919 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" Nov 29 03:42:50 crc kubenswrapper[4749]: E1129 03:42:50.566931 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-log" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.566938 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-log" Nov 29 03:42:50 crc kubenswrapper[4749]: E1129 03:42:50.566956 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.566965 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.567168 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" 
containerName="nova-api-api" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.567186 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" containerName="nova-api-log" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.567241 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-metadata" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.567259 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="456ec409-a59d-4080-ac18-96207a4138a6" containerName="nova-metadata-log" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.568468 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.570565 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.583706 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.593727 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.609126 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.610940 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.615182 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.623581 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.687344 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.687824 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbnjg\" (UniqueName: \"kubernetes.io/projected/10e95981-686e-470b-b989-aedec673798b-kube-api-access-qbnjg\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.688098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-config-data\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.688288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e95981-686e-470b-b989-aedec673798b-logs\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: W1129 03:42:50.708527 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc468b431_5762_4449_8467_64844ca96b2d.slice/crio-29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433 WatchSource:0}: Error finding container 29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433: Status 404 returned error can't find the container with id 29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433 Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.710707 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.718960 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg"] Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8s8b\" (UniqueName: \"kubernetes.io/projected/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-kube-api-access-m8s8b\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790398 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbnjg\" (UniqueName: \"kubernetes.io/projected/10e95981-686e-470b-b989-aedec673798b-kube-api-access-qbnjg\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-logs\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790603 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-config-data\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-config-data\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e95981-686e-470b-b989-aedec673798b-logs\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.790943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.792110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10e95981-686e-470b-b989-aedec673798b-logs\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.795474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.795742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10e95981-686e-470b-b989-aedec673798b-config-data\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.813444 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbnjg\" (UniqueName: \"kubernetes.io/projected/10e95981-686e-470b-b989-aedec673798b-kube-api-access-qbnjg\") pod \"nova-api-0\" (UID: \"10e95981-686e-470b-b989-aedec673798b\") " pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.898250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-logs\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.898332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.898383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-config-data\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.900113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-logs\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.900871 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8s8b\" (UniqueName: \"kubernetes.io/projected/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-kube-api-access-m8s8b\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.903640 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-config-data\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.903954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.909584 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.921428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8s8b\" (UniqueName: \"kubernetes.io/projected/b4cc0ff4-0c4e-4ad0-8e15-8758c486221d-kube-api-access-m8s8b\") pod \"nova-metadata-0\" (UID: \"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d\") " pod="openstack/nova-metadata-0" Nov 29 03:42:50 crc kubenswrapper[4749]: I1129 03:42:50.943676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.032266 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.104355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle\") pod \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.104471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqs4n\" (UniqueName: \"kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n\") pod \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.104516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data\") pod \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\" (UID: \"0e5f3f01-7951-4368-ac89-9e98a03dd5b3\") " Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.113586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n" (OuterVolumeSpecName: "kube-api-access-lqs4n") pod "0e5f3f01-7951-4368-ac89-9e98a03dd5b3" (UID: "0e5f3f01-7951-4368-ac89-9e98a03dd5b3"). InnerVolumeSpecName "kube-api-access-lqs4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.124997 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06c8327c-c860-4340-87b5-bc2a939d986a" path="/var/lib/kubelet/pods/06c8327c-c860-4340-87b5-bc2a939d986a/volumes" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.126017 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456ec409-a59d-4080-ac18-96207a4138a6" path="/var/lib/kubelet/pods/456ec409-a59d-4080-ac18-96207a4138a6/volumes" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.147568 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e5f3f01-7951-4368-ac89-9e98a03dd5b3" (UID: "0e5f3f01-7951-4368-ac89-9e98a03dd5b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.149160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data" (OuterVolumeSpecName: "config-data") pod "0e5f3f01-7951-4368-ac89-9e98a03dd5b3" (UID: "0e5f3f01-7951-4368-ac89-9e98a03dd5b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.198918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" event={"ID":"c468b431-5762-4449-8467-64844ca96b2d","Type":"ContainerStarted","Data":"29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433"} Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.200996 4749 generic.go:334] "Generic (PLEG): container finished" podID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" exitCode=0 Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.201056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e5f3f01-7951-4368-ac89-9e98a03dd5b3","Type":"ContainerDied","Data":"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8"} Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.201076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e5f3f01-7951-4368-ac89-9e98a03dd5b3","Type":"ContainerDied","Data":"a0cbbee9627b1e1039a4353476fd279a19d26496a2c0b42ebf12a2afadb5d436"} Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.201095 4749 scope.go:117] "RemoveContainer" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.201230 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.206625 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.206653 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqs4n\" (UniqueName: \"kubernetes.io/projected/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-kube-api-access-lqs4n\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.206662 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5f3f01-7951-4368-ac89-9e98a03dd5b3-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.232458 4749 scope.go:117] "RemoveContainer" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" Nov 29 03:42:51 crc kubenswrapper[4749]: E1129 03:42:51.232797 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8\": container with ID starting with 463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8 not found: ID does not exist" containerID="463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.232823 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8"} err="failed to get container status \"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8\": rpc error: code = NotFound desc = could not find container \"463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8\": container with ID starting with 463ab19e490a3bb8503a68b6950694b1e4e545f8f574fbcce45410dc4d58d2f8 not found: ID does not exist" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.245675 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.257355 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.264622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: E1129 03:42:51.265072 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerName="nova-scheduler-scheduler" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.265091 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerName="nova-scheduler-scheduler" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.265296 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" containerName="nova-scheduler-scheduler" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.265978 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.269159 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.273629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.308926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.309016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-config-data\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.309207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k5qq\" (UniqueName: \"kubernetes.io/projected/06cfd47d-b269-4f6e-aea7-aaa037d7375b-kube-api-access-2k5qq\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.411048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k5qq\" (UniqueName: \"kubernetes.io/projected/06cfd47d-b269-4f6e-aea7-aaa037d7375b-kube-api-access-2k5qq\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.411117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.411223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-config-data\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.414147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-config-data\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.415921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06cfd47d-b269-4f6e-aea7-aaa037d7375b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.427000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k5qq\" (UniqueName: 
\"kubernetes.io/projected/06cfd47d-b269-4f6e-aea7-aaa037d7375b-kube-api-access-2k5qq\") pod \"nova-scheduler-0\" (UID: \"06cfd47d-b269-4f6e-aea7-aaa037d7375b\") " pod="openstack/nova-scheduler-0" Nov 29 03:42:51 crc kubenswrapper[4749]: W1129 03:42:51.528963 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10e95981_686e_470b_b989_aedec673798b.slice/crio-59b307eda6f74be65f6915f621c2ca6953fb4bd2420878757cecbd28efc7fe89 WatchSource:0}: Error finding container 59b307eda6f74be65f6915f621c2ca6953fb4bd2420878757cecbd28efc7fe89: Status 404 returned error can't find the container with id 59b307eda6f74be65f6915f621c2ca6953fb4bd2420878757cecbd28efc7fe89 Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.531018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.583350 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 29 03:42:51 crc kubenswrapper[4749]: I1129 03:42:51.586368 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.048253 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.218087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10e95981-686e-470b-b989-aedec673798b","Type":"ContainerStarted","Data":"eec0319c423a62450f21bdedd1058fa6f43db17525db734f623046c4acd5cecc"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.218129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10e95981-686e-470b-b989-aedec673798b","Type":"ContainerStarted","Data":"5bd768cb0449716918e80764a3caddb3cb66412406466871193f23ba47cc78b8"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.218140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10e95981-686e-470b-b989-aedec673798b","Type":"ContainerStarted","Data":"59b307eda6f74be65f6915f621c2ca6953fb4bd2420878757cecbd28efc7fe89"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.219891 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" event={"ID":"c468b431-5762-4449-8467-64844ca96b2d","Type":"ContainerStarted","Data":"701dc8a2bba419840f64c8343a9ea67baea457bab4bcc6607df33f362124e4e4"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.221439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06cfd47d-b269-4f6e-aea7-aaa037d7375b","Type":"ContainerStarted","Data":"e4c1da910bf7824a2c52a0d8bd0f6d01ec41c165f904a0b170da568a2fe47013"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.224267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d","Type":"ContainerStarted","Data":"5f049abf32b6216c18ad901c2b2fa610e41eeb4177d504fca01e167e0e36c0f6"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.224317 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d","Type":"ContainerStarted","Data":"43149b57fce8d9da949872ec73f08dd541e2c29ccb5d0382608754bdd901609a"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 
03:42:52.224333 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4cc0ff4-0c4e-4ad0-8e15-8758c486221d","Type":"ContainerStarted","Data":"c2510d0db2f8f3d8bb3d418cd5c28ff034c08894291b34fed6fcc819e5786f9f"} Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.264896 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.264873727 podStartE2EDuration="2.264873727s" podCreationTimestamp="2025-11-29 03:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:42:52.236859096 +0000 UTC m=+9115.409008953" watchObservedRunningTime="2025-11-29 03:42:52.264873727 +0000 UTC m=+9115.437023594" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.286011 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.285993561 podStartE2EDuration="2.285993561s" podCreationTimestamp="2025-11-29 03:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:42:52.259055356 +0000 UTC m=+9115.431205223" watchObservedRunningTime="2025-11-29 03:42:52.285993561 +0000 UTC m=+9115.458143418" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.300783 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.300767061 podStartE2EDuration="1.300767061s" podCreationTimestamp="2025-11-29 03:42:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:42:52.273653811 +0000 UTC m=+9115.445803668" watchObservedRunningTime="2025-11-29 03:42:52.300767061 +0000 UTC m=+9115.472916918" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.327871 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" podStartSLOduration=2.66405401 podStartE2EDuration="3.327836589s" podCreationTimestamp="2025-11-29 03:42:49 +0000 UTC" firstStartedPulling="2025-11-29 03:42:50.710533833 +0000 UTC m=+9113.882683690" lastFinishedPulling="2025-11-29 03:42:51.374316412 +0000 UTC m=+9114.546466269" observedRunningTime="2025-11-29 03:42:52.295320128 +0000 UTC m=+9115.467469985" watchObservedRunningTime="2025-11-29 03:42:52.327836589 +0000 UTC m=+9115.499986446" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.686053 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.751697 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfkz9\" (UniqueName: \"kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9\") pod \"74d94086-38fb-4dfa-8cb1-09f2ca302406\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.751753 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data\") pod \"74d94086-38fb-4dfa-8cb1-09f2ca302406\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.751896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle\") pod \"74d94086-38fb-4dfa-8cb1-09f2ca302406\" (UID: \"74d94086-38fb-4dfa-8cb1-09f2ca302406\") " Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.761126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9" (OuterVolumeSpecName: "kube-api-access-dfkz9") pod "74d94086-38fb-4dfa-8cb1-09f2ca302406" (UID: "74d94086-38fb-4dfa-8cb1-09f2ca302406"). InnerVolumeSpecName "kube-api-access-dfkz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.791674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d94086-38fb-4dfa-8cb1-09f2ca302406" (UID: "74d94086-38fb-4dfa-8cb1-09f2ca302406"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.796017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data" (OuterVolumeSpecName: "config-data") pod "74d94086-38fb-4dfa-8cb1-09f2ca302406" (UID: "74d94086-38fb-4dfa-8cb1-09f2ca302406"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.853891 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.853927 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfkz9\" (UniqueName: \"kubernetes.io/projected/74d94086-38fb-4dfa-8cb1-09f2ca302406-kube-api-access-dfkz9\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:52 crc kubenswrapper[4749]: I1129 03:42:52.853940 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d94086-38fb-4dfa-8cb1-09f2ca302406-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.094225 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5f3f01-7951-4368-ac89-9e98a03dd5b3" path="/var/lib/kubelet/pods/0e5f3f01-7951-4368-ac89-9e98a03dd5b3/volumes" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.239299 4749 generic.go:334] "Generic (PLEG): container finished" podID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" exitCode=0 Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.239368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"74d94086-38fb-4dfa-8cb1-09f2ca302406","Type":"ContainerDied","Data":"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106"} Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.239400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"74d94086-38fb-4dfa-8cb1-09f2ca302406","Type":"ContainerDied","Data":"1b120063e6e64087c5cd8e124e6308c915122ef3d85daa6570286150913e5abd"} Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.239421 4749 scope.go:117] "RemoveContainer" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.239548 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.245112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06cfd47d-b269-4f6e-aea7-aaa037d7375b","Type":"ContainerStarted","Data":"44ea4b08809c61897f145c48136c76632996b39fc7b2f002614493f3e0f3f917"} Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.287500 4749 scope.go:117] "RemoveContainer" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" Nov 29 03:42:53 crc kubenswrapper[4749]: E1129 03:42:53.288063 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106\": container with ID starting with 378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106 not found: ID does not exist" containerID="378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.288117 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106"} err="failed to get container status \"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106\": rpc error: code = NotFound desc = could not find container \"378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106\": container with ID starting with 378df139bf4d31aca115b2d44b3b399a363a99e963479f124618b5d237e43106 not found: ID does not exist" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.295017 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.314405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.325255 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:53 crc kubenswrapper[4749]: E1129 03:42:53.325844 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerName="nova-cell0-conductor-conductor" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.325865 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerName="nova-cell0-conductor-conductor" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.326190 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" containerName="nova-cell0-conductor-conductor" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.327153 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.330557 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.334004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.468054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qpqx\" (UniqueName: \"kubernetes.io/projected/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-kube-api-access-6qpqx\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.468123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.468146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.569848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qpqx\" (UniqueName: \"kubernetes.io/projected/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-kube-api-access-6qpqx\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.570417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.570452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.575441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.586863 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.587492 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qpqx\" (UniqueName: \"kubernetes.io/projected/b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f-kube-api-access-6qpqx\") pod \"nova-cell0-conductor-0\" (UID: \"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f\") " pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:53 crc kubenswrapper[4749]: I1129 03:42:53.651990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:54 crc kubenswrapper[4749]: I1129 03:42:54.208879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 29 03:42:54 crc kubenswrapper[4749]: W1129 03:42:54.216620 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb31dbaba_8a6f_4e88_8b0f_4d3ea583ef7f.slice/crio-34e2cc3858fb6068239d2cc0bd2f3b668e88177d7b6546b3b407d27b3501c0f3 WatchSource:0}: Error finding container 34e2cc3858fb6068239d2cc0bd2f3b668e88177d7b6546b3b407d27b3501c0f3: Status 404 returned error can't find the container with id 34e2cc3858fb6068239d2cc0bd2f3b668e88177d7b6546b3b407d27b3501c0f3 Nov 29 03:42:54 crc kubenswrapper[4749]: I1129 03:42:54.253480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f","Type":"ContainerStarted","Data":"34e2cc3858fb6068239d2cc0bd2f3b668e88177d7b6546b3b407d27b3501c0f3"} Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.086565 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d94086-38fb-4dfa-8cb1-09f2ca302406" path="/var/lib/kubelet/pods/74d94086-38fb-4dfa-8cb1-09f2ca302406/volumes" Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.273520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f","Type":"ContainerStarted","Data":"66e83767080c7e57dc52756b0f30a2fc2d171aa86903848959d4e57a537cb6f8"} Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.274477 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.308315 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.308293027 podStartE2EDuration="2.308293027s" podCreationTimestamp="2025-11-29 03:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 03:42:55.299413791 +0000 UTC m=+9118.471563648" watchObservedRunningTime="2025-11-29 03:42:55.308293027 +0000 UTC m=+9118.480442884" Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.374422 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.374492 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 
03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.944666 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 03:42:55 crc kubenswrapper[4749]: I1129 03:42:55.945894 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 29 03:42:56 crc kubenswrapper[4749]: I1129 03:42:56.508872 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 29 03:42:56 crc kubenswrapper[4749]: I1129 03:42:56.586609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 29 03:43:00 crc kubenswrapper[4749]: I1129 03:43:00.910468 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 03:43:00 crc kubenswrapper[4749]: I1129 03:43:00.911054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 29 03:43:00 crc kubenswrapper[4749]: I1129 03:43:00.944703 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 03:43:00 crc kubenswrapper[4749]: I1129 03:43:00.944788 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 29 03:43:01 crc kubenswrapper[4749]: I1129 03:43:01.587145 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 29 03:43:01 crc kubenswrapper[4749]: I1129 03:43:01.621840 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 29 03:43:01 crc kubenswrapper[4749]: I1129 03:43:01.992445 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10e95981-686e-470b-b989-aedec673798b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 03:43:02 crc kubenswrapper[4749]: I1129 03:43:02.075328 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10e95981-686e-470b-b989-aedec673798b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 03:43:02 crc kubenswrapper[4749]: I1129 03:43:02.075611 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b4cc0ff4-0c4e-4ad0-8e15-8758c486221d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 03:43:02 crc kubenswrapper[4749]: I1129 03:43:02.075636 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b4cc0ff4-0c4e-4ad0-8e15-8758c486221d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.190:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 29 03:43:02 crc kubenswrapper[4749]: I1129 03:43:02.404058 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 29 03:43:03 crc kubenswrapper[4749]: I1129 03:43:03.709742 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 
03:43:10.915903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.916863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.917299 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.917384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.922026 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.927082 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.946968 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.947720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 29 03:43:10 crc kubenswrapper[4749]: I1129 03:43:10.948781 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 03:43:11 crc kubenswrapper[4749]: I1129 03:43:11.474698 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.374147 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.374806 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.374874 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.376063 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.376178 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" gracePeriod=600 Nov 29 03:43:25 crc kubenswrapper[4749]: E1129 03:43:25.509022 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.640486 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" exitCode=0 Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.640517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"} Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.640563 4749 scope.go:117] "RemoveContainer" containerID="ab0f4311a913e2b86c25c5800f847627aee2740cd1de1da90d15b6b9d0517539" Nov 29 03:43:25 crc kubenswrapper[4749]: I1129 03:43:25.641304 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:43:25 crc kubenswrapper[4749]: E1129 03:43:25.641589 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:43:40 crc kubenswrapper[4749]: I1129 03:43:40.075800 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:43:40 crc kubenswrapper[4749]: E1129 03:43:40.076608 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:43:51 crc kubenswrapper[4749]: I1129 03:43:51.078948 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:43:51 crc kubenswrapper[4749]: E1129 03:43:51.080104 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:44:06 crc kubenswrapper[4749]: I1129 03:44:06.075919 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:44:06 crc kubenswrapper[4749]: E1129 03:44:06.077925 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Nov 29 03:44:17 crc kubenswrapper[4749]: I1129 03:44:17.094488 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"
Nov 29 03:44:17 crc kubenswrapper[4749]: E1129 03:44:17.096073 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:44:30 crc kubenswrapper[4749]: I1129 03:44:30.074784 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"
Nov 29 03:44:30 crc kubenswrapper[4749]: E1129 03:44:30.076513 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:44:45 crc kubenswrapper[4749]: I1129 03:44:45.076957 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"
Nov 29 03:44:45 crc kubenswrapper[4749]: E1129 03:44:45.077862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.076290 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea"
Nov 29 03:45:00 crc kubenswrapper[4749]: E1129 03:45:00.077532 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.162240 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c"]
Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.164866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c"
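The RemoveContainer / "Error syncing pod" pairs repeat every ten to twenty seconds from 03:43:25 through 03:48:11 because each sync attempt is rejected while the restart back-off holds; the back-off itself grows per restart up to the 5m0s cap quoted in the message. A sketch of that capped doubling, assuming the usual upstream defaults (10s initial delay, factor 2); only the 5m cap is actually visible in this log.

    package main

    import (
    	"fmt"
    	"time"
    )

    // crashLoopDelay returns a capped exponential restart back-off:
    // 10s, 20s, 40s, ... up to 5m0s. Initial delay and factor are
    // assumed defaults, not values read from this node.
    func crashLoopDelay(restarts int, initial, max time.Duration) time.Duration {
    	d := initial
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d >= max {
    			return max
    		}
    	}
    	return d
    }

    func main() {
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("restart %d -> back-off %s\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
    	}
    }
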
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.168111 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.168498 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.178238 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c"] Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.308683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.308937 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.309274 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfjc\" (UniqueName: \"kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.411794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfjc\" (UniqueName: \"kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.412596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.412801 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.413940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume\") pod 
\"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.421186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.430451 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfjc\" (UniqueName: \"kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc\") pod \"collect-profiles-29406465-mbv6c\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:00 crc kubenswrapper[4749]: I1129 03:45:00.496375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:01 crc kubenswrapper[4749]: I1129 03:45:01.072578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c"] Nov 29 03:45:01 crc kubenswrapper[4749]: I1129 03:45:01.960986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" event={"ID":"ad77d0cd-2bb9-436f-8921-3d5650006193","Type":"ContainerStarted","Data":"048c35fcb9abba72391c2c156b31ca192e70583fe95c4ed962d5dbb52d6a42e3"} Nov 29 03:45:01 crc kubenswrapper[4749]: I1129 03:45:01.962677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" event={"ID":"ad77d0cd-2bb9-436f-8921-3d5650006193","Type":"ContainerStarted","Data":"0aa7538dfd6f6b005a6e2f21cbc382320a707757355446a10382493439a94a80"} Nov 29 03:45:02 crc kubenswrapper[4749]: I1129 03:45:02.977524 4749 generic.go:334] "Generic (PLEG): container finished" podID="ad77d0cd-2bb9-436f-8921-3d5650006193" containerID="048c35fcb9abba72391c2c156b31ca192e70583fe95c4ed962d5dbb52d6a42e3" exitCode=0 Nov 29 03:45:02 crc kubenswrapper[4749]: I1129 03:45:02.978025 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" event={"ID":"ad77d0cd-2bb9-436f-8921-3d5650006193","Type":"ContainerDied","Data":"048c35fcb9abba72391c2c156b31ca192e70583fe95c4ed962d5dbb52d6a42e3"} Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.370208 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.516814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znfjc\" (UniqueName: \"kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc\") pod \"ad77d0cd-2bb9-436f-8921-3d5650006193\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.517003 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume\") pod \"ad77d0cd-2bb9-436f-8921-3d5650006193\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.517030 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume\") pod \"ad77d0cd-2bb9-436f-8921-3d5650006193\" (UID: \"ad77d0cd-2bb9-436f-8921-3d5650006193\") " Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.518032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad77d0cd-2bb9-436f-8921-3d5650006193" (UID: "ad77d0cd-2bb9-436f-8921-3d5650006193"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.523692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc" (OuterVolumeSpecName: "kube-api-access-znfjc") pod "ad77d0cd-2bb9-436f-8921-3d5650006193" (UID: "ad77d0cd-2bb9-436f-8921-3d5650006193"). InnerVolumeSpecName "kube-api-access-znfjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.524328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad77d0cd-2bb9-436f-8921-3d5650006193" (UID: "ad77d0cd-2bb9-436f-8921-3d5650006193"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.619588 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znfjc\" (UniqueName: \"kubernetes.io/projected/ad77d0cd-2bb9-436f-8921-3d5650006193-kube-api-access-znfjc\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.619621 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad77d0cd-2bb9-436f-8921-3d5650006193-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:04 crc kubenswrapper[4749]: I1129 03:45:04.619635 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad77d0cd-2bb9-436f-8921-3d5650006193-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:05 crc kubenswrapper[4749]: I1129 03:45:05.005126 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" event={"ID":"ad77d0cd-2bb9-436f-8921-3d5650006193","Type":"ContainerDied","Data":"0aa7538dfd6f6b005a6e2f21cbc382320a707757355446a10382493439a94a80"} Nov 29 03:45:05 crc kubenswrapper[4749]: I1129 03:45:05.005186 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa7538dfd6f6b005a6e2f21cbc382320a707757355446a10382493439a94a80" Nov 29 03:45:05 crc kubenswrapper[4749]: I1129 03:45:05.005271 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406465-mbv6c" Nov 29 03:45:05 crc kubenswrapper[4749]: I1129 03:45:05.086402 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9"] Nov 29 03:45:05 crc kubenswrapper[4749]: I1129 03:45:05.091804 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406420-hzsm9"] Nov 29 03:45:07 crc kubenswrapper[4749]: I1129 03:45:07.089614 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2178e763-77a7-4221-a7e4-db0afbfe42c8" path="/var/lib/kubelet/pods/2178e763-77a7-4221-a7e4-db0afbfe42c8/volumes" Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.074812 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:45:12 crc kubenswrapper[4749]: E1129 03:45:12.075648 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.579460 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rttt9"] Nov 29 03:45:12 crc kubenswrapper[4749]: E1129 03:45:12.580875 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad77d0cd-2bb9-436f-8921-3d5650006193" containerName="collect-profiles" Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.581083 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad77d0cd-2bb9-436f-8921-3d5650006193" containerName="collect-profiles" Nov 29 03:45:12 crc 
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.585890 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.599694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rttt9"]
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.725136 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.725285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.725412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97phk\" (UniqueName: \"kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.827928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.828157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.828223 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97phk\" (UniqueName: \"kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.828607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.828811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.902357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97phk\" (UniqueName: \"kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk\") pod \"community-operators-rttt9\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") " pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:12 crc kubenswrapper[4749]: I1129 03:45:12.922990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:13 crc kubenswrapper[4749]: I1129 03:45:13.263761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rttt9"]
Nov 29 03:45:14 crc kubenswrapper[4749]: I1129 03:45:14.147282 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerID="98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1" exitCode=0
Nov 29 03:45:14 crc kubenswrapper[4749]: I1129 03:45:14.148983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerDied","Data":"98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1"}
Nov 29 03:45:14 crc kubenswrapper[4749]: I1129 03:45:14.149008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerStarted","Data":"3aaaab26c7682198ae545299f3e5c79ffa03feb5eb5c2d17f832428f1f59b19a"}
Nov 29 03:45:16 crc kubenswrapper[4749]: I1129 03:45:16.169869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerStarted","Data":"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef"}
Nov 29 03:45:17 crc kubenswrapper[4749]: I1129 03:45:17.184799 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerID="9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef" exitCode=0
Nov 29 03:45:17 crc kubenswrapper[4749]: I1129 03:45:17.184880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerDied","Data":"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef"}
Nov 29 03:45:18 crc kubenswrapper[4749]: I1129 03:45:18.203583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerStarted","Data":"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4"}
Nov 29 03:45:18 crc kubenswrapper[4749]: I1129 03:45:18.238155 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rttt9" podStartSLOduration=2.784182715 podStartE2EDuration="6.238129458s" podCreationTimestamp="2025-11-29 03:45:12 +0000 UTC" firstStartedPulling="2025-11-29 03:45:14.159133453 +0000 UTC m=+9257.331283310" lastFinishedPulling="2025-11-29 03:45:17.613080156 +0000 UTC m=+9260.785230053" observedRunningTime="2025-11-29 03:45:18.224572887 +0000 UTC m=+9261.396722784" watchObservedRunningTime="2025-11-29 03:45:18.238129458 +0000 UTC m=+9261.410279355"
Nov 29 03:45:22 crc kubenswrapper[4749]: I1129 03:45:22.923757 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:22 crc kubenswrapper[4749]: I1129 03:45:22.924608 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:23 crc kubenswrapper[4749]: I1129 03:45:23.005293 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:23 crc kubenswrapper[4749]: I1129 03:45:23.351621 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:23 crc kubenswrapper[4749]: I1129 03:45:23.418949 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rttt9"]
Nov 29 03:45:25 crc kubenswrapper[4749]: I1129 03:45:25.306902 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rttt9" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="registry-server" containerID="cri-o://438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4" gracePeriod=2
Nov 29 03:45:25 crc kubenswrapper[4749]: I1129 03:45:25.901485 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rttt9"
Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.049962 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content\") pod \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") "
Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.050054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities\") pod \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") "
Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.050274 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97phk\" (UniqueName: \"kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk\") pod \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\" (UID: \"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1\") "
Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.051639 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities" (OuterVolumeSpecName: "utilities") pod "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" (UID: "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.058344 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk" (OuterVolumeSpecName: "kube-api-access-97phk") pod "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" (UID: "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1"). InnerVolumeSpecName "kube-api-access-97phk". PluginName "kubernetes.io/projected", VolumeGidValue ""
InnerVolumeSpecName "kube-api-access-97phk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.076159 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:45:26 crc kubenswrapper[4749]: E1129 03:45:26.076490 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.104074 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" (UID: "0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.153961 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97phk\" (UniqueName: \"kubernetes.io/projected/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-kube-api-access-97phk\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.154001 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.154113 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.320364 4749 generic.go:334] "Generic (PLEG): container finished" podID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerID="438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4" exitCode=0 Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.320422 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerDied","Data":"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4"} Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.320454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rttt9" event={"ID":"0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1","Type":"ContainerDied","Data":"3aaaab26c7682198ae545299f3e5c79ffa03feb5eb5c2d17f832428f1f59b19a"} Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.320478 4749 scope.go:117] "RemoveContainer" containerID="438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.320641 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rttt9" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.359377 4749 scope.go:117] "RemoveContainer" containerID="9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.370719 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rttt9"] Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.384797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rttt9"] Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.400660 4749 scope.go:117] "RemoveContainer" containerID="98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.458280 4749 scope.go:117] "RemoveContainer" containerID="438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4" Nov 29 03:45:26 crc kubenswrapper[4749]: E1129 03:45:26.459083 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4\": container with ID starting with 438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4 not found: ID does not exist" containerID="438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.459129 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4"} err="failed to get container status \"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4\": rpc error: code = NotFound desc = could not find container \"438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4\": container with ID starting with 438c320d479d91022bb456f4d5bc00125dbea2c77ba7b98cd959a44db67f4cc4 not found: ID does not exist" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.459155 4749 scope.go:117] "RemoveContainer" containerID="9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef" Nov 29 03:45:26 crc kubenswrapper[4749]: E1129 03:45:26.459795 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef\": container with ID starting with 9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef not found: ID does not exist" containerID="9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.459833 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef"} err="failed to get container status \"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef\": rpc error: code = NotFound desc = could not find container \"9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef\": container with ID starting with 9ea522117244139bd88ce4e99a3569d5d44865a87607ecc9098baf310dd6a4ef not found: ID does not exist" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.459858 4749 scope.go:117] "RemoveContainer" containerID="98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1" Nov 29 03:45:26 crc kubenswrapper[4749]: E1129 03:45:26.460455 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1\": container with ID starting with 98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1 not found: ID does not exist" containerID="98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1" Nov 29 03:45:26 crc kubenswrapper[4749]: I1129 03:45:26.460489 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1"} err="failed to get container status \"98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1\": rpc error: code = NotFound desc = could not find container \"98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1\": container with ID starting with 98d9a5dfddeef85fff3c421018a73364f53655158159f0a68d849701a7ec8ea1 not found: ID does not exist" Nov 29 03:45:27 crc kubenswrapper[4749]: I1129 03:45:27.097195 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" path="/var/lib/kubelet/pods/0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1/volumes" Nov 29 03:45:27 crc kubenswrapper[4749]: I1129 03:45:27.636508 4749 scope.go:117] "RemoveContainer" containerID="6d1ebeaf0b5202f7c74a8f46100c2a419b905acd371abbc810cf17114c8e4600" Nov 29 03:45:27 crc kubenswrapper[4749]: I1129 03:45:27.678117 4749 scope.go:117] "RemoveContainer" containerID="d7038e31c1fa80e2b96e53c64b240314b35770699df56a4e921544099d5becbd" Nov 29 03:45:27 crc kubenswrapper[4749]: I1129 03:45:27.809282 4749 scope.go:117] "RemoveContainer" containerID="cb7014033e9321cfee398cc2d67ba7aa8dea2b5b66b46079782ade323f696aa5" Nov 29 03:45:27 crc kubenswrapper[4749]: I1129 03:45:27.849693 4749 scope.go:117] "RemoveContainer" containerID="399534f9fb3d312ff1a21933e21bbc3c8d014f3a1b2920afca152db2b62f3b69" Nov 29 03:45:39 crc kubenswrapper[4749]: I1129 03:45:39.075120 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:45:39 crc kubenswrapper[4749]: E1129 03:45:39.075942 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:45:52 crc kubenswrapper[4749]: I1129 03:45:52.075420 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:45:52 crc kubenswrapper[4749]: E1129 03:45:52.076778 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:46:05 crc kubenswrapper[4749]: I1129 03:46:05.076078 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:46:05 crc kubenswrapper[4749]: E1129 03:46:05.077413 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:46:20 crc kubenswrapper[4749]: I1129 03:46:20.075876 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:46:20 crc kubenswrapper[4749]: E1129 03:46:20.077021 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:46:35 crc kubenswrapper[4749]: I1129 03:46:35.076719 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:46:35 crc kubenswrapper[4749]: E1129 03:46:35.078501 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:46:50 crc kubenswrapper[4749]: I1129 03:46:50.075658 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:46:50 crc kubenswrapper[4749]: E1129 03:46:50.076974 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:47:04 crc kubenswrapper[4749]: I1129 03:47:04.075029 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:47:04 crc kubenswrapper[4749]: E1129 03:47:04.075927 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:47:18 crc kubenswrapper[4749]: I1129 03:47:18.075557 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:47:18 crc kubenswrapper[4749]: E1129 03:47:18.076594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:47:31 crc kubenswrapper[4749]: I1129 03:47:31.075887 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:47:31 crc kubenswrapper[4749]: E1129 03:47:31.079012 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:47:45 crc kubenswrapper[4749]: I1129 03:47:45.075536 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:47:45 crc kubenswrapper[4749]: E1129 03:47:45.076385 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:47:58 crc kubenswrapper[4749]: I1129 03:47:58.076650 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:47:58 crc kubenswrapper[4749]: E1129 03:47:58.078655 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:48:11 crc kubenswrapper[4749]: I1129 03:48:11.075323 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:48:11 crc kubenswrapper[4749]: E1129 03:48:11.076267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:48:26 crc kubenswrapper[4749]: I1129 03:48:26.075326 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:48:27 crc kubenswrapper[4749]: I1129 03:48:27.489230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce"} Nov 29 03:49:14 crc kubenswrapper[4749]: I1129 03:49:14.088567 4749 generic.go:334] "Generic (PLEG): container finished" 
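The "Generic (PLEG): container finished" lines and the ContainerStarted/ContainerDied events they precede come from the pod lifecycle event generator, which periodically relists containers and diffs each pod's container states against the previous relist. A toy Go version of that diff; the two-state "running"/"exited" model is a simplification of the real runtime states:

    package main

    import "fmt"

    type event struct{ Type, ID string }

    // diff compares a pod's container states between two relists and
    // emits the corresponding lifecycle events, loosely mirroring what
    // the PLEG relist does.
    func diff(prev, curr map[string]string) []event {
    	var evs []event
    	for id, state := range curr {
    		switch {
    		case prev[id] != "running" && state == "running":
    			evs = append(evs, event{"ContainerStarted", id})
    		case prev[id] == "running" && state == "exited":
    			evs = append(evs, event{"ContainerDied", id})
    		}
    	}
    	return evs
    }

    func main() {
    	prev := map[string]string{"701dc8a2bba4": "running"}
    	curr := map[string]string{"701dc8a2bba4": "exited"}
    	fmt.Println(diff(prev, curr)) // [{ContainerDied 701dc8a2bba4}]
    }
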
podID="c468b431-5762-4449-8467-64844ca96b2d" containerID="701dc8a2bba419840f64c8343a9ea67baea457bab4bcc6607df33f362124e4e4" exitCode=0 Nov 29 03:49:14 crc kubenswrapper[4749]: I1129 03:49:14.088703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" event={"ID":"c468b431-5762-4449-8467-64844ca96b2d","Type":"ContainerDied","Data":"701dc8a2bba419840f64c8343a9ea67baea457bab4bcc6607df33f362124e4e4"} Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.121789 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" event={"ID":"c468b431-5762-4449-8467-64844ca96b2d","Type":"ContainerDied","Data":"29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433"} Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.122359 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29756bb87e5b63e3dc54d27749f74ccbddef13cb7fb6c4f954709250a95ad433" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.212322 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406313 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406402 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwqsk\" (UniqueName: \"kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406609 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406656 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.406747 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.407450 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.407496 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory\") pod \"c468b431-5762-4449-8467-64844ca96b2d\" (UID: \"c468b431-5762-4449-8467-64844ca96b2d\") " Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.413359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph" (OuterVolumeSpecName: "ceph") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.416759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.417423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk" (OuterVolumeSpecName: "kube-api-access-fwqsk") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "kube-api-access-fwqsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.452881 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.457029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.464280 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.464822 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.480009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.483012 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory" (OuterVolumeSpecName: "inventory") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.491607 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.495345 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c468b431-5762-4449-8467-64844ca96b2d" (UID: "c468b431-5762-4449-8467-64844ca96b2d"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520401 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520809 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520823 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520837 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520852 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-inventory\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520864 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520875 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520889 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c468b431-5762-4449-8467-64844ca96b2d-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520902 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwqsk\" (UniqueName: \"kubernetes.io/projected/c468b431-5762-4449-8467-64844ca96b2d-kube-api-access-fwqsk\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520913 4749 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ceph\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:16 crc kubenswrapper[4749]: I1129 03:49:16.520925 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c468b431-5762-4449-8467-64844ca96b2d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:17 crc kubenswrapper[4749]: I1129 03:49:17.133751 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.625953 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:31 crc kubenswrapper[4749]: E1129 03:49:31.633317 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c468b431-5762-4449-8467-64844ca96b2d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633347 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c468b431-5762-4449-8467-64844ca96b2d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 29 03:49:31 crc kubenswrapper[4749]: E1129 03:49:31.633401 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="registry-server" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633415 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="registry-server" Nov 29 03:49:31 crc kubenswrapper[4749]: E1129 03:49:31.633458 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="extract-utilities" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633473 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="extract-utilities" Nov 29 03:49:31 crc kubenswrapper[4749]: E1129 03:49:31.633498 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="extract-content" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633510 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="extract-content" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633920 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c468b431-5762-4449-8467-64844ca96b2d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.633956 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9d9646-b5a2-4792-9ac9-7f57cc4d3ea1" containerName="registry-server" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.646316 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.660463 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.735066 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.735653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvr7\" (UniqueName: \"kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.735778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.838496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvr7\" (UniqueName: \"kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.838584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.838626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.841579 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.841650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:31 crc kubenswrapper[4749]: I1129 03:49:31.877029 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jbvr7\" (UniqueName: \"kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7\") pod \"certified-operators-b98zk\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:32 crc kubenswrapper[4749]: I1129 03:49:32.005334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:32 crc kubenswrapper[4749]: I1129 03:49:32.631637 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:33 crc kubenswrapper[4749]: I1129 03:49:33.390886 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerID="a8c4eaa288d18ed35744c975ecda8d95b248438433198c34c3eeb6cad279327a" exitCode=0 Nov 29 03:49:33 crc kubenswrapper[4749]: I1129 03:49:33.390971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerDied","Data":"a8c4eaa288d18ed35744c975ecda8d95b248438433198c34c3eeb6cad279327a"} Nov 29 03:49:33 crc kubenswrapper[4749]: I1129 03:49:33.391286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerStarted","Data":"238054a7793c7d3dca63ece6752857f0d25b008a9870962502649d8d616d2964"} Nov 29 03:49:33 crc kubenswrapper[4749]: I1129 03:49:33.394323 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:49:34 crc kubenswrapper[4749]: I1129 03:49:34.404207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerStarted","Data":"b74cf1b7357fe20b19afed0179f0f776d38f7968d4104c96b81e21420dc4ed9b"} Nov 29 03:49:35 crc kubenswrapper[4749]: I1129 03:49:35.424431 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerID="b74cf1b7357fe20b19afed0179f0f776d38f7968d4104c96b81e21420dc4ed9b" exitCode=0 Nov 29 03:49:35 crc kubenswrapper[4749]: I1129 03:49:35.424509 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerDied","Data":"b74cf1b7357fe20b19afed0179f0f776d38f7968d4104c96b81e21420dc4ed9b"} Nov 29 03:49:36 crc kubenswrapper[4749]: I1129 03:49:36.457907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerStarted","Data":"b41e7c23299238b8255443689a5df5fbad7de29bac82afa0ecbcf22b2907b3c6"} Nov 29 03:49:36 crc kubenswrapper[4749]: I1129 03:49:36.516720 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b98zk" podStartSLOduration=2.986111236 podStartE2EDuration="5.516690973s" podCreationTimestamp="2025-11-29 03:49:31 +0000 UTC" firstStartedPulling="2025-11-29 03:49:33.39377474 +0000 UTC m=+9516.565924637" lastFinishedPulling="2025-11-29 03:49:35.924354477 +0000 UTC m=+9519.096504374" observedRunningTime="2025-11-29 03:49:36.494336598 +0000 UTC m=+9519.666486485" watchObservedRunningTime="2025-11-29 
03:49:36.516690973 +0000 UTC m=+9519.688840870" Nov 29 03:49:38 crc kubenswrapper[4749]: E1129 03:49:38.841493 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:34636->38.102.83.30:35737: write tcp 38.102.83.30:34636->38.102.83.30:35737: write: broken pipe Nov 29 03:49:42 crc kubenswrapper[4749]: I1129 03:49:42.006576 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:42 crc kubenswrapper[4749]: I1129 03:49:42.006998 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:42 crc kubenswrapper[4749]: I1129 03:49:42.079886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:42 crc kubenswrapper[4749]: I1129 03:49:42.629548 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:42 crc kubenswrapper[4749]: I1129 03:49:42.715093 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:45 crc kubenswrapper[4749]: I1129 03:49:45.218934 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b98zk" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="registry-server" containerID="cri-o://b41e7c23299238b8255443689a5df5fbad7de29bac82afa0ecbcf22b2907b3c6" gracePeriod=2 Nov 29 03:49:46 crc kubenswrapper[4749]: I1129 03:49:46.235317 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerID="b41e7c23299238b8255443689a5df5fbad7de29bac82afa0ecbcf22b2907b3c6" exitCode=0 Nov 29 03:49:46 crc kubenswrapper[4749]: I1129 03:49:46.235419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerDied","Data":"b41e7c23299238b8255443689a5df5fbad7de29bac82afa0ecbcf22b2907b3c6"} Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.077813 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.215092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbvr7\" (UniqueName: \"kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7\") pod \"6f715404-3ab1-4750-8e1c-e17dc2149594\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.215328 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content\") pod \"6f715404-3ab1-4750-8e1c-e17dc2149594\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.215406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities\") pod \"6f715404-3ab1-4750-8e1c-e17dc2149594\" (UID: \"6f715404-3ab1-4750-8e1c-e17dc2149594\") " Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.217942 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities" (OuterVolumeSpecName: "utilities") pod "6f715404-3ab1-4750-8e1c-e17dc2149594" (UID: "6f715404-3ab1-4750-8e1c-e17dc2149594"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.260051 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b98zk" event={"ID":"6f715404-3ab1-4750-8e1c-e17dc2149594","Type":"ContainerDied","Data":"238054a7793c7d3dca63ece6752857f0d25b008a9870962502649d8d616d2964"} Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.260142 4749 scope.go:117] "RemoveContainer" containerID="b41e7c23299238b8255443689a5df5fbad7de29bac82afa0ecbcf22b2907b3c6" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.260149 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b98zk" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.299038 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7" (OuterVolumeSpecName: "kube-api-access-jbvr7") pod "6f715404-3ab1-4750-8e1c-e17dc2149594" (UID: "6f715404-3ab1-4750-8e1c-e17dc2149594"). InnerVolumeSpecName "kube-api-access-jbvr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.311141 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f715404-3ab1-4750-8e1c-e17dc2149594" (UID: "6f715404-3ab1-4750-8e1c-e17dc2149594"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.318237 4749 scope.go:117] "RemoveContainer" containerID="b74cf1b7357fe20b19afed0179f0f776d38f7968d4104c96b81e21420dc4ed9b" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.319026 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbvr7\" (UniqueName: \"kubernetes.io/projected/6f715404-3ab1-4750-8e1c-e17dc2149594-kube-api-access-jbvr7\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.319074 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.319092 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f715404-3ab1-4750-8e1c-e17dc2149594-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.362591 4749 scope.go:117] "RemoveContainer" containerID="a8c4eaa288d18ed35744c975ecda8d95b248438433198c34c3eeb6cad279327a" Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.607960 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:47 crc kubenswrapper[4749]: I1129 03:49:47.617075 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b98zk"] Nov 29 03:49:49 crc kubenswrapper[4749]: I1129 03:49:49.092984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" path="/var/lib/kubelet/pods/6f715404-3ab1-4750-8e1c-e17dc2149594/volumes" Nov 29 03:50:55 crc kubenswrapper[4749]: I1129 03:50:55.374071 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:50:55 crc kubenswrapper[4749]: I1129 03:50:55.374737 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:51:25 crc kubenswrapper[4749]: I1129 03:51:25.374868 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:51:25 crc kubenswrapper[4749]: I1129 03:51:25.375440 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:51:32 crc kubenswrapper[4749]: I1129 03:51:32.117888 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Nov 29 03:51:32 crc kubenswrapper[4749]: I1129 03:51:32.119261 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="0dea6975-e5ed-4016-b418-88a79f113bd4" containerName="adoption" containerID="cri-o://71226f11aa10a347fc1f88c1de493850783c83ebc6f7eacd0784c6b4dc6a1aa8" gracePeriod=30 Nov 29 03:51:55 crc kubenswrapper[4749]: I1129 03:51:55.374285 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:51:55 crc kubenswrapper[4749]: I1129 03:51:55.374938 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:51:55 crc kubenswrapper[4749]: I1129 03:51:55.375000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:51:55 crc kubenswrapper[4749]: I1129 03:51:55.376073 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:51:55 crc kubenswrapper[4749]: I1129 03:51:55.376168 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce" gracePeriod=600 Nov 29 03:51:56 crc kubenswrapper[4749]: I1129 03:51:56.142071 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce" exitCode=0 Nov 29 03:51:56 crc kubenswrapper[4749]: I1129 03:51:56.142153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce"} Nov 29 03:51:56 crc kubenswrapper[4749]: I1129 03:51:56.142462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"} Nov 29 03:51:56 crc kubenswrapper[4749]: I1129 03:51:56.142511 4749 scope.go:117] "RemoveContainer" containerID="f5fa17ccfdde2959c5918b3a40fdbd96a0c7809fb396bb8e394f72d2bcff8fea" Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.239523 4749 generic.go:334] "Generic (PLEG): container finished" podID="0dea6975-e5ed-4016-b418-88a79f113bd4" containerID="71226f11aa10a347fc1f88c1de493850783c83ebc6f7eacd0784c6b4dc6a1aa8" exitCode=137 Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.239655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/mariadb-copy-data" event={"ID":"0dea6975-e5ed-4016-b418-88a79f113bd4","Type":"ContainerDied","Data":"71226f11aa10a347fc1f88c1de493850783c83ebc6f7eacd0784c6b4dc6a1aa8"} Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.817672 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.937767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xllz7\" (UniqueName: \"kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7\") pod \"0dea6975-e5ed-4016-b418-88a79f113bd4\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.938369 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") pod \"0dea6975-e5ed-4016-b418-88a79f113bd4\" (UID: \"0dea6975-e5ed-4016-b418-88a79f113bd4\") " Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.956148 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7" (OuterVolumeSpecName: "kube-api-access-xllz7") pod "0dea6975-e5ed-4016-b418-88a79f113bd4" (UID: "0dea6975-e5ed-4016-b418-88a79f113bd4"). InnerVolumeSpecName "kube-api-access-xllz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:52:02 crc kubenswrapper[4749]: I1129 03:52:02.960508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7" (OuterVolumeSpecName: "mariadb-data") pod "0dea6975-e5ed-4016-b418-88a79f113bd4" (UID: "0dea6975-e5ed-4016-b418-88a79f113bd4"). InnerVolumeSpecName "pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.041321 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xllz7\" (UniqueName: \"kubernetes.io/projected/0dea6975-e5ed-4016-b418-88a79f113bd4-kube-api-access-xllz7\") on node \"crc\" DevicePath \"\"" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.041387 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") on node \"crc\" " Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.068145 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.068524 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7") on node "crc" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.143702 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6b90f073-9ad0-4f24-9af2-0c69cc4a72a7\") on node \"crc\" DevicePath \"\"" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.254267 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"0dea6975-e5ed-4016-b418-88a79f113bd4","Type":"ContainerDied","Data":"e1c43d34fb020911a8f49fe71f11e289341cc546519bdfb9679d2f13ca84ee01"} Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.254314 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.254348 4749 scope.go:117] "RemoveContainer" containerID="71226f11aa10a347fc1f88c1de493850783c83ebc6f7eacd0784c6b4dc6a1aa8" Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.295683 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Nov 29 03:52:03 crc kubenswrapper[4749]: I1129 03:52:03.310120 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Nov 29 03:52:04 crc kubenswrapper[4749]: I1129 03:52:04.133504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Nov 29 03:52:04 crc kubenswrapper[4749]: I1129 03:52:04.134244 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" containerName="adoption" containerID="cri-o://50a096985df077833a10cea58e399e10a213947b479dd0a553b96283b1671ed8" gracePeriod=30 Nov 29 03:52:05 crc kubenswrapper[4749]: I1129 03:52:05.092363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dea6975-e5ed-4016-b418-88a79f113bd4" path="/var/lib/kubelet/pods/0dea6975-e5ed-4016-b418-88a79f113bd4/volumes" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.646721 4749 generic.go:334] "Generic (PLEG): container finished" podID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" containerID="50a096985df077833a10cea58e399e10a213947b479dd0a553b96283b1671ed8" exitCode=137 Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.646807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5e1acf21-8f1e-49f7-83fe-7a6fe053a823","Type":"ContainerDied","Data":"50a096985df077833a10cea58e399e10a213947b479dd0a553b96283b1671ed8"} Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.647575 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5e1acf21-8f1e-49f7-83fe-7a6fe053a823","Type":"ContainerDied","Data":"28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f"} Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.647612 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f547c5689925416428c98c92ccc6a2b895472a0c581eb06362eff66b28540f" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.657999 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.726094 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") pod \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.726236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfmbv\" (UniqueName: \"kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv\") pod \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.726628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert\") pod \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\" (UID: \"5e1acf21-8f1e-49f7-83fe-7a6fe053a823\") " Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.732370 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "5e1acf21-8f1e-49f7-83fe-7a6fe053a823" (UID: "5e1acf21-8f1e-49f7-83fe-7a6fe053a823"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.737576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv" (OuterVolumeSpecName: "kube-api-access-jfmbv") pod "5e1acf21-8f1e-49f7-83fe-7a6fe053a823" (UID: "5e1acf21-8f1e-49f7-83fe-7a6fe053a823"). InnerVolumeSpecName "kube-api-access-jfmbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.751254 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d" (OuterVolumeSpecName: "ovn-data") pod "5e1acf21-8f1e-49f7-83fe-7a6fe053a823" (UID: "5e1acf21-8f1e-49f7-83fe-7a6fe053a823"). InnerVolumeSpecName "pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.832672 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.832747 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") on node \"crc\" " Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.832761 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfmbv\" (UniqueName: \"kubernetes.io/projected/5e1acf21-8f1e-49f7-83fe-7a6fe053a823-kube-api-access-jfmbv\") on node \"crc\" DevicePath \"\"" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.857592 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping UnmountDevice... Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.857904 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d") on node "crc" Nov 29 03:52:34 crc kubenswrapper[4749]: I1129 03:52:34.935139 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ca1b7e0-1e88-4215-a7db-9351a09cd69d\") on node \"crc\" DevicePath \"\"" Nov 29 03:52:35 crc kubenswrapper[4749]: I1129 03:52:35.657997 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Nov 29 03:52:35 crc kubenswrapper[4749]: I1129 03:52:35.690064 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Nov 29 03:52:35 crc kubenswrapper[4749]: I1129 03:52:35.699685 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Nov 29 03:52:37 crc kubenswrapper[4749]: I1129 03:52:37.086488 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" path="/var/lib/kubelet/pods/5e1acf21-8f1e-49f7-83fe-7a6fe053a823/volumes" Nov 29 03:53:28 crc kubenswrapper[4749]: I1129 03:53:28.136263 4749 scope.go:117] "RemoveContainer" containerID="50a096985df077833a10cea58e399e10a213947b479dd0a553b96283b1671ed8" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.233695 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlwsx/must-gather-bmrnf"] Nov 29 03:53:33 crc kubenswrapper[4749]: E1129 03:53:33.234910 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.234929 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: E1129 03:53:33.234941 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dea6975-e5ed-4016-b418-88a79f113bd4" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.234949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dea6975-e5ed-4016-b418-88a79f113bd4" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: E1129 03:53:33.234996 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="registry-server" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235020 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="registry-server" Nov 29 03:53:33 crc kubenswrapper[4749]: E1129 03:53:33.235038 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="extract-content" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235046 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="extract-content" Nov 29 03:53:33 crc kubenswrapper[4749]: E1129 03:53:33.235079 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="extract-utilities" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235088 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="extract-utilities" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235365 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f715404-3ab1-4750-8e1c-e17dc2149594" containerName="registry-server" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235397 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dea6975-e5ed-4016-b418-88a79f113bd4" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.235417 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1acf21-8f1e-49f7-83fe-7a6fe053a823" containerName="adoption" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.236933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.239011 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hlwsx"/"openshift-service-ca.crt" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.239479 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hlwsx"/"kube-root-ca.crt" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.244992 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlwsx/must-gather-bmrnf"] Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.246089 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hlwsx"/"default-dockercfg-qfj9x" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.400699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output\") pod \"must-gather-bmrnf\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.400999 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhkg\" (UniqueName: \"kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg\") pod \"must-gather-bmrnf\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.502753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output\") pod \"must-gather-bmrnf\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.502839 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhkg\" (UniqueName: \"kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg\") pod \"must-gather-bmrnf\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.503345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output\") pod \"must-gather-bmrnf\" (UID: 
\"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.520812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhkg\" (UniqueName: \"kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg\") pod \"must-gather-bmrnf\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:33 crc kubenswrapper[4749]: I1129 03:53:33.564242 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 03:53:34 crc kubenswrapper[4749]: I1129 03:53:34.117671 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlwsx/must-gather-bmrnf"] Nov 29 03:53:34 crc kubenswrapper[4749]: W1129 03:53:34.146542 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97b28cd_23db_4aac_91de_8d4008cb0384.slice/crio-0df236aa1d1f0959eab4c3f08c7cb2c166a70160ca4860aba6f10c68ad7feea6 WatchSource:0}: Error finding container 0df236aa1d1f0959eab4c3f08c7cb2c166a70160ca4860aba6f10c68ad7feea6: Status 404 returned error can't find the container with id 0df236aa1d1f0959eab4c3f08c7cb2c166a70160ca4860aba6f10c68ad7feea6 Nov 29 03:53:34 crc kubenswrapper[4749]: I1129 03:53:34.871666 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" event={"ID":"a97b28cd-23db-4aac-91de-8d4008cb0384","Type":"ContainerStarted","Data":"0df236aa1d1f0959eab4c3f08c7cb2c166a70160ca4860aba6f10c68ad7feea6"} Nov 29 03:53:38 crc kubenswrapper[4749]: I1129 03:53:38.919460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" event={"ID":"a97b28cd-23db-4aac-91de-8d4008cb0384","Type":"ContainerStarted","Data":"070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847"} Nov 29 03:53:38 crc kubenswrapper[4749]: I1129 03:53:38.919930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" event={"ID":"a97b28cd-23db-4aac-91de-8d4008cb0384","Type":"ContainerStarted","Data":"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9"} Nov 29 03:53:39 crc kubenswrapper[4749]: I1129 03:53:39.953960 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" podStartSLOduration=3.213522029 podStartE2EDuration="6.953928438s" podCreationTimestamp="2025-11-29 03:53:33 +0000 UTC" firstStartedPulling="2025-11-29 03:53:34.149551214 +0000 UTC m=+9757.321701071" lastFinishedPulling="2025-11-29 03:53:37.889957583 +0000 UTC m=+9761.062107480" observedRunningTime="2025-11-29 03:53:39.946031715 +0000 UTC m=+9763.118181602" watchObservedRunningTime="2025-11-29 03:53:39.953928438 +0000 UTC m=+9763.126078325" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.022921 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-rr2ms"] Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.025560 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.209171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvg7r\" (UniqueName: \"kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.209631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.311387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.311574 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.311590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvg7r\" (UniqueName: \"kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.338137 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvg7r\" (UniqueName: \"kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r\") pod \"crc-debug-rr2ms\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.347039 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:53:44 crc kubenswrapper[4749]: I1129 03:53:44.990068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" event={"ID":"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9","Type":"ContainerStarted","Data":"15c9f5ce22d51f8d58ccc8f46b38fb4a5a0af132822c7861b8d78eff5f07cd40"} Nov 29 03:53:55 crc kubenswrapper[4749]: I1129 03:53:55.374582 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:53:55 crc kubenswrapper[4749]: I1129 03:53:55.375102 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:53:59 crc kubenswrapper[4749]: E1129 03:53:59.764175 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Nov 29 03:53:59 crc kubenswrapper[4749]: E1129 03:53:59.764935 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvg7r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-rr2ms_openshift-must-gather-hlwsx(464c127e-cc88-4b2d-ac42-cbd7e0ae52f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 29 03:53:59 crc kubenswrapper[4749]: E1129 03:53:59.766186 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" Nov 29 03:54:00 crc kubenswrapper[4749]: E1129 03:54:00.159051 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" Nov 29 03:54:14 crc kubenswrapper[4749]: I1129 03:54:14.307384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" event={"ID":"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9","Type":"ContainerStarted","Data":"abc38d90efd2f4edef4a9e600540e4022c8e43e6f7d031e7fa59fa0c2c8ffc5f"} Nov 29 03:54:14 crc kubenswrapper[4749]: I1129 03:54:14.331981 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" podStartSLOduration=1.127220771 podStartE2EDuration="30.331960408s" podCreationTimestamp="2025-11-29 03:53:44 +0000 UTC" firstStartedPulling="2025-11-29 03:53:44.380115457 +0000 UTC m=+9767.552265314" lastFinishedPulling="2025-11-29 03:54:13.584855094 +0000 UTC m=+9796.757004951" observedRunningTime="2025-11-29 03:54:14.323103172 +0000 UTC m=+9797.495253039" watchObservedRunningTime="2025-11-29 03:54:14.331960408 +0000 UTC m=+9797.504110285" Nov 29 03:54:25 crc kubenswrapper[4749]: I1129 03:54:25.374625 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:54:25 crc kubenswrapper[4749]: I1129 03:54:25.375285 
4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:54:31 crc kubenswrapper[4749]: I1129 03:54:31.475740 4749 generic.go:334] "Generic (PLEG): container finished" podID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" containerID="abc38d90efd2f4edef4a9e600540e4022c8e43e6f7d031e7fa59fa0c2c8ffc5f" exitCode=0 Nov 29 03:54:31 crc kubenswrapper[4749]: I1129 03:54:31.476122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" event={"ID":"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9","Type":"ContainerDied","Data":"abc38d90efd2f4edef4a9e600540e4022c8e43e6f7d031e7fa59fa0c2c8ffc5f"} Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.644761 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.688212 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-rr2ms"] Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.699320 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-rr2ms"] Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.736631 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvg7r\" (UniqueName: \"kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r\") pod \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.736981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host\") pod \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\" (UID: \"464c127e-cc88-4b2d-ac42-cbd7e0ae52f9\") " Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.737182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host" (OuterVolumeSpecName: "host") pod "464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" (UID: "464c127e-cc88-4b2d-ac42-cbd7e0ae52f9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.737594 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-host\") on node \"crc\" DevicePath \"\"" Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.743466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r" (OuterVolumeSpecName: "kube-api-access-qvg7r") pod "464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" (UID: "464c127e-cc88-4b2d-ac42-cbd7e0ae52f9"). InnerVolumeSpecName "kube-api-access-qvg7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:54:32 crc kubenswrapper[4749]: I1129 03:54:32.843221 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvg7r\" (UniqueName: \"kubernetes.io/projected/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9-kube-api-access-qvg7r\") on node \"crc\" DevicePath \"\"" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.087646 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" path="/var/lib/kubelet/pods/464c127e-cc88-4b2d-ac42-cbd7e0ae52f9/volumes" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.516396 4749 scope.go:117] "RemoveContainer" containerID="abc38d90efd2f4edef4a9e600540e4022c8e43e6f7d031e7fa59fa0c2c8ffc5f" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.516476 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-rr2ms" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.910695 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-lsvwj"] Nov 29 03:54:33 crc kubenswrapper[4749]: E1129 03:54:33.911342 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" containerName="container-00" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.911355 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" containerName="container-00" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.911575 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="464c127e-cc88-4b2d-ac42-cbd7e0ae52f9" containerName="container-00" Nov 29 03:54:33 crc kubenswrapper[4749]: I1129 03:54:33.912300 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.083121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.083610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h6vb\" (UniqueName: \"kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.186034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h6vb\" (UniqueName: \"kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.186138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.186289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.205034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h6vb\" (UniqueName: \"kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb\") pod \"crc-debug-lsvwj\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.229148 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:34 crc kubenswrapper[4749]: I1129 03:54:34.531033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" event={"ID":"aa8f362c-e447-4bb0-b04b-13abefba9933","Type":"ContainerStarted","Data":"4c093e763f484d1dd71f27de8c0c3e2fc09d1413261c89b955a56817422a9ba0"} Nov 29 03:54:35 crc kubenswrapper[4749]: I1129 03:54:35.545417 4749 generic.go:334] "Generic (PLEG): container finished" podID="aa8f362c-e447-4bb0-b04b-13abefba9933" containerID="9fb66f0fa986979bd51955886325d6cc5c6148f10884b8ddcf800876ea637ce0" exitCode=1 Nov 29 03:54:35 crc kubenswrapper[4749]: I1129 03:54:35.545502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" event={"ID":"aa8f362c-e447-4bb0-b04b-13abefba9933","Type":"ContainerDied","Data":"9fb66f0fa986979bd51955886325d6cc5c6148f10884b8ddcf800876ea637ce0"} Nov 29 03:54:35 crc kubenswrapper[4749]: I1129 03:54:35.597538 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-lsvwj"] Nov 29 03:54:35 crc kubenswrapper[4749]: I1129 03:54:35.607368 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlwsx/crc-debug-lsvwj"] Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.690056 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.847878 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host\") pod \"aa8f362c-e447-4bb0-b04b-13abefba9933\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.847962 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host" (OuterVolumeSpecName: "host") pod "aa8f362c-e447-4bb0-b04b-13abefba9933" (UID: "aa8f362c-e447-4bb0-b04b-13abefba9933"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.848069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h6vb\" (UniqueName: \"kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb\") pod \"aa8f362c-e447-4bb0-b04b-13abefba9933\" (UID: \"aa8f362c-e447-4bb0-b04b-13abefba9933\") " Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.848824 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa8f362c-e447-4bb0-b04b-13abefba9933-host\") on node \"crc\" DevicePath \"\"" Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.862527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb" (OuterVolumeSpecName: "kube-api-access-4h6vb") pod "aa8f362c-e447-4bb0-b04b-13abefba9933" (UID: "aa8f362c-e447-4bb0-b04b-13abefba9933"). InnerVolumeSpecName "kube-api-access-4h6vb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 03:54:36 crc kubenswrapper[4749]: I1129 03:54:36.951696 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h6vb\" (UniqueName: \"kubernetes.io/projected/aa8f362c-e447-4bb0-b04b-13abefba9933-kube-api-access-4h6vb\") on node \"crc\" DevicePath \"\"" Nov 29 03:54:37 crc kubenswrapper[4749]: I1129 03:54:37.107148 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa8f362c-e447-4bb0-b04b-13abefba9933" path="/var/lib/kubelet/pods/aa8f362c-e447-4bb0-b04b-13abefba9933/volumes" Nov 29 03:54:37 crc kubenswrapper[4749]: I1129 03:54:37.572021 4749 scope.go:117] "RemoveContainer" containerID="9fb66f0fa986979bd51955886325d6cc5c6148f10884b8ddcf800876ea637ce0" Nov 29 03:54:37 crc kubenswrapper[4749]: I1129 03:54:37.572235 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/crc-debug-lsvwj" Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.373910 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.374575 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.374634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.375571 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.375644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" gracePeriod=600 Nov 29 03:54:55 crc kubenswrapper[4749]: E1129 03:54:55.510109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.799416 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" exitCode=0 Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.799643 
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.373910 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.374575 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.374634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct"
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.375571 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.375644 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" gracePeriod=600
Nov 29 03:54:55 crc kubenswrapper[4749]: E1129 03:54:55.510109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.799416 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" exitCode=0
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.799643 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"}
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.799855 4749 scope.go:117] "RemoveContainer" containerID="a78b2d7715ebb1d91e95d6ff31ad716259e43edf877244ecfaa2d1ab0d8d9cce"
Nov 29 03:54:55 crc kubenswrapper[4749]: I1129 03:54:55.801557 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:54:55 crc kubenswrapper[4749]: E1129 03:54:55.802087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:55:06 crc kubenswrapper[4749]: I1129 03:55:06.075424 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:55:06 crc kubenswrapper[4749]: E1129 03:55:06.077601 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:55:21 crc kubenswrapper[4749]: I1129 03:55:21.075735 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:55:21 crc kubenswrapper[4749]: E1129 03:55:21.076799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:55:34 crc kubenswrapper[4749]: I1129 03:55:34.075083 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:55:34 crc kubenswrapper[4749]: E1129 03:55:34.077605 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:55:47 crc kubenswrapper[4749]: I1129 03:55:47.098102 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:55:47 crc kubenswrapper[4749]: E1129 03:55:47.099423 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:56:02 crc kubenswrapper[4749]: I1129 03:56:02.075369 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:56:02 crc kubenswrapper[4749]: E1129 03:56:02.076100 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:56:15 crc kubenswrapper[4749]: I1129 03:56:15.076125 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:56:15 crc kubenswrapper[4749]: E1129 03:56:15.077523 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:56:28 crc kubenswrapper[4749]: I1129 03:56:28.077295 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:56:28 crc kubenswrapper[4749]: E1129 03:56:28.078310 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:56:39 crc kubenswrapper[4749]: I1129 03:56:39.076587 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:56:39 crc kubenswrapper[4749]: E1129 03:56:39.077546 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:56:52 crc kubenswrapper[4749]: I1129 03:56:52.074714 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:56:52 crc kubenswrapper[4749]: E1129 03:56:52.075540 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:57:07 crc kubenswrapper[4749]: I1129 03:57:07.084586 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:57:07 crc kubenswrapper[4749]: E1129 03:57:07.085349 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:57:19 crc kubenswrapper[4749]: I1129 03:57:19.075908 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:57:19 crc kubenswrapper[4749]: E1129 03:57:19.077095 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:57:34 crc kubenswrapper[4749]: I1129 03:57:34.074929 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:57:34 crc kubenswrapper[4749]: E1129 03:57:34.075550 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:57:48 crc kubenswrapper[4749]: I1129 03:57:48.077917 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:57:48 crc kubenswrapper[4749]: E1129 03:57:48.079109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.217960 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_44380984-9564-4578-8a96-024ee3db589a/init-config-reloader/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.449130 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_44380984-9564-4578-8a96-024ee3db589a/init-config-reloader/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.531926 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_44380984-9564-4578-8a96-024ee3db589a/config-reloader/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.698027 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_44380984-9564-4578-8a96-024ee3db589a/alertmanager/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.731536 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c4f049eb-f374-4140-b694-2af94e54001e/aodh-api/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.763569 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c4f049eb-f374-4140-b694-2af94e54001e/aodh-evaluator/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.888440 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c4f049eb-f374-4140-b694-2af94e54001e/aodh-listener/0.log"
Nov 29 03:57:50 crc kubenswrapper[4749]: I1129 03:57:50.982353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_c4f049eb-f374-4140-b694-2af94e54001e/aodh-notifier/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.011668 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55dbb95c78-qbs5z_34b6e3bf-224f-4796-844b-7b720cd27e67/barbican-api/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.106438 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55dbb95c78-qbs5z_34b6e3bf-224f-4796-844b-7b720cd27e67/barbican-api-log/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.241285 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694fccbdcd-wbj66_c386cbf4-8348-49c7-b1d0-35e519fe20e6/barbican-keystone-listener/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.280493 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-694fccbdcd-wbj66_c386cbf4-8348-49c7-b1d0-35e519fe20e6/barbican-keystone-listener-log/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.466684 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f6ddb5fdc-v9j2j_2c5e7bf3-d98e-4d90-8fad-c71017fa20c4/barbican-worker-log/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.487672 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5f6ddb5fdc-v9j2j_2c5e7bf3-d98e-4d90-8fad-c71017fa20c4/barbican-worker/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.668926 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-d2xwc_aaf7982b-ed1d-4ff8-8f7a-43edefe2c6d6/bootstrap-openstack-openstack-cell1/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.679837 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6b84c17-5e5b-4464-9890-31bb49853d6d/ceilometer-central-agent/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.812986 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6b84c17-5e5b-4464-9890-31bb49853d6d/ceilometer-notification-agent/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.917578 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6b84c17-5e5b-4464-9890-31bb49853d6d/proxy-httpd/0.log"
Nov 29 03:57:51 crc kubenswrapper[4749]: I1129 03:57:51.921222 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_b6b84c17-5e5b-4464-9890-31bb49853d6d/sg-core/0.log"
Nov 29 03:57:52 crc kubenswrapper[4749]: I1129 03:57:52.040950 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-kbbt2_28ba02e8-574d-4047-8c56-edbcea634220/ceph-client-openstack-openstack-cell1/0.log"
Nov 29 03:57:52 crc kubenswrapper[4749]: I1129 03:57:52.661444 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_17155680-b07e-462b-a201-4823c5613f54/cinder-api/0.log"
Nov 29 03:57:52 crc kubenswrapper[4749]: I1129 03:57:52.739498 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_17155680-b07e-462b-a201-4823c5613f54/cinder-api-log/0.log"
Nov 29 03:57:52 crc kubenswrapper[4749]: I1129 03:57:52.921506 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a36dabb0-136d-40c7-b9b1-7174cd3ba355/probe/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.026032 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a36dabb0-136d-40c7-b9b1-7174cd3ba355/cinder-backup/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.032005 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b12b8c41-e630-4f71-bc3f-24fdd2b25a5c/cinder-scheduler/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.175781 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b12b8c41-e630-4f71-bc3f-24fdd2b25a5c/probe/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.305114 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f1bad501-c950-4fad-b698-59f5ad3f3e63/probe/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.342157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_f1bad501-c950-4fad-b698-59f5ad3f3e63/cinder-volume/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.775789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-gg88z_8aa7ec69-393b-44ce-90f2-4efb3812bbb9/configure-os-openstack-openstack-cell1/0.log"
Nov 29 03:57:53 crc kubenswrapper[4749]: I1129 03:57:53.795449 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-vkzv6_74859cb5-3819-4f6a-8eae-b82e47e0f7e4/configure-network-openstack-openstack-cell1/0.log"
Nov 29 03:57:54 crc kubenswrapper[4749]: I1129 03:57:54.581528 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-697db9564f-g8dnz_6f7ce334-1fd4-4745-b003-8291d6592f93/init/0.log"
Nov 29 03:57:54 crc kubenswrapper[4749]: I1129 03:57:54.752407 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-697db9564f-g8dnz_6f7ce334-1fd4-4745-b003-8291d6592f93/init/0.log"
Nov 29 03:57:54 crc kubenswrapper[4749]: I1129 03:57:54.804661 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-w4srh_f5944a87-7112-4372-b615-59ae77bec28b/download-cache-openstack-openstack-cell1/0.log"
Nov 29 03:57:54 crc kubenswrapper[4749]: I1129 03:57:54.814939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-697db9564f-g8dnz_6f7ce334-1fd4-4745-b003-8291d6592f93/dnsmasq-dns/0.log"
Nov 29 03:57:54 crc kubenswrapper[4749]: I1129 03:57:54.997536 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a0d1f66a-70df-410a-9b29-bd418f5ba498/glance-log/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.000894 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a0d1f66a-70df-410a-9b29-bd418f5ba498/glance-httpd/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.071451 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ab76b9bf-6cf3-47b8-8356-32da5b8b939a/glance-log/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.147191 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ab76b9bf-6cf3-47b8-8356-32da5b8b939a/glance-httpd/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.294015 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-74ddcf9444-kzl5n_79248744-29f0-43bd-b44a-a4b8c42aae39/heat-api/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.409360 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-588c68cbfd-bccbx_1fda202d-bbc6-494b-89ff-e49cff899f83/heat-cfnapi/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.434052 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5c7858b669-qhgk5_29f760d9-0335-46d5-b098-1df1cf5067e0/heat-engine/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.630342 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-bjfr2_443e1ef6-779b-44ec-9f24-6a661a47a0a6/install-certs-openstack-openstack-cell1/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.712732 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56d7497497-ck5ws_20808d24-277c-4c49-8c37-42d5e337cb3b/horizon/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.737239 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56d7497497-ck5ws_20808d24-277c-4c49-8c37-42d5e337cb3b/horizon-log/0.log"
Nov 29 03:57:55 crc kubenswrapper[4749]: I1129 03:57:55.824977 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-tm2vg_79934424-e3be-4b95-843c-65b7f2bcb76f/install-os-openstack-openstack-cell1/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.059563 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29406421-hn8jz_9347885d-8de5-4420-a979-c96f8e80d931/keystone-cron/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.086412 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-595f8844b9-gpbrs_80d0cbf8-416f-4012-9123-58c4921deb36/keystone-api/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.249181 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_55eaf704-2521-4487-a458-f38da05c48fc/kube-state-metrics/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.251952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-ms5ld_402e3628-3422-4ebd-b3aa-aa8b36553f92/libvirt-openstack-openstack-cell1/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.473789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_460434e5-fafa-4ef1-b56d-f266f28c6a76/manila-api-log/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.491132 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_460434e5-fafa-4ef1-b56d-f266f28c6a76/manila-api/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.540887 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4fb28a84-0189-40d7-9be0-5de128a0290c/manila-scheduler/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.599756 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4fb28a84-0189-40d7-9be0-5de128a0290c/probe/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.750474 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e8bba430-1fea-4c72-a7cd-bdb8f6d91533/manila-share/0.log"
Nov 29 03:57:56 crc kubenswrapper[4749]: I1129 03:57:56.772053 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e8bba430-1fea-4c72-a7cd-bdb8f6d91533/probe/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.189989 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-766b69ccd5-fhfd7_cf9ee58f-5bd3-42b4-9004-699db5c01c70/neutron-api/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.195770 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-766b69ccd5-fhfd7_cf9ee58f-5bd3-42b4-9004-699db5c01c70/neutron-httpd/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.490347 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-jl4rt_89a85fb9-4f52-4edb-a999-f5e373694943/neutron-dhcp-openstack-openstack-cell1/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.535629 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-c9ztz_0d40b0dd-b6a5-482b-87f9-d2780c14f322/neutron-metadata-openstack-openstack-cell1/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.789963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-gnwc6_8e10fa10-d91b-497f-801e-2b6093ebdb8d/neutron-sriov-openstack-openstack-cell1/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.907926 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_10e95981-686e-470b-b989-aedec673798b/nova-api-api/0.log"
Nov 29 03:57:57 crc kubenswrapper[4749]: I1129 03:57:57.959602 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_10e95981-686e-470b-b989-aedec673798b/nova-api-log/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.130836 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b31dbaba-8a6f-4e88-8b0f-4d3ea583ef7f/nova-cell0-conductor-conductor/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.272847 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9987255c-ae1e-412a-b2c0-f0043906ccd3/nova-cell1-conductor-conductor/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.427606 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_cbf6f457-e2a8-4503-8120-81ef8237ef59/nova-cell1-novncproxy-novncproxy/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.577965 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-celltgxhg_c468b431-5762-4449-8467-64844ca96b2d/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.807265 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-wv7hb_ec28e2f8-bbc8-4808-8f15-e314e66ef4ec/nova-cell1-openstack-openstack-cell1/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.852309 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b4cc0ff4-0c4e-4ad0-8e15-8758c486221d/nova-metadata-log/0.log"
Nov 29 03:57:58 crc kubenswrapper[4749]: I1129 03:57:58.946491 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b4cc0ff4-0c4e-4ad0-8e15-8758c486221d/nova-metadata-metadata/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.159233 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-549d54fc68-lh9xg_aabf9b86-851e-47f7-9591-a2526d225e62/init/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.256983 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_06cfd47d-b269-4f6e-aea7-aaa037d7375b/nova-scheduler-scheduler/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.407830 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-549d54fc68-lh9xg_aabf9b86-851e-47f7-9591-a2526d225e62/init/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.421910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-549d54fc68-lh9xg_aabf9b86-851e-47f7-9591-a2526d225e62/octavia-api-provider-agent/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.628336 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-549d54fc68-lh9xg_aabf9b86-851e-47f7-9591-a2526d225e62/octavia-api/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.629005 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-kx7r9_cecd151e-d11b-4784-95b1-2af60d6017a6/init/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.815899 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-kx7r9_cecd151e-d11b-4784-95b1-2af60d6017a6/init/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.868565 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-n672w_91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c/init/0.log"
Nov 29 03:57:59 crc kubenswrapper[4749]: I1129 03:57:59.901306 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-kx7r9_cecd151e-d11b-4784-95b1-2af60d6017a6/octavia-healthmanager/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.074851 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:58:00 crc kubenswrapper[4749]: E1129 03:58:00.078371 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.094255 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-82cgw_9423ac5b-2562-42f0-b428-7970fead8108/init/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.099549 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-n672w_91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c/octavia-housekeeping/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.161843 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-n672w_91dc96dc-0ef3-4fd5-b324-016cd3d1ba4c/init/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.397321 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-82cgw_9423ac5b-2562-42f0-b428-7970fead8108/octavia-rsyslog/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.446931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-82cgw_9423ac5b-2562-42f0-b428-7970fead8108/init/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.468233 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-xddrk_f261d77c-0a22-41ae-9b6f-7a43382b8ca8/init/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.646568 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-xddrk_f261d77c-0a22-41ae-9b6f-7a43382b8ca8/init/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.810703 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-xddrk_f261d77c-0a22-41ae-9b6f-7a43382b8ca8/octavia-worker/0.log"
Nov 29 03:58:00 crc kubenswrapper[4749]: I1129 03:58:00.886011 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_203e2625-2d50-46c8-b781-5f7fd1304777/mysql-bootstrap/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.286016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_203e2625-2d50-46c8-b781-5f7fd1304777/mysql-bootstrap/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.380752 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0aadf221-92e1-4c07-8f50-4a5e503b8870/mysql-bootstrap/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.411997 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_203e2625-2d50-46c8-b781-5f7fd1304777/galera/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.597724 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0aadf221-92e1-4c07-8f50-4a5e503b8870/mysql-bootstrap/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.608131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a76d2e70-2758-463e-b25f-6cd80067450a/openstackclient/0.log"
Nov 29 03:58:01 crc kubenswrapper[4749]: I1129 03:58:01.676291 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0aadf221-92e1-4c07-8f50-4a5e503b8870/galera/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.262179 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6fbp7_4af332b0-069f-4cf1-976a-076483cfe432/openstack-network-exporter/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.330719 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7bnqw_2d20ab0c-d80d-4dd5-98d2-8f09ec505527/ovsdb-server-init/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.623740 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7bnqw_2d20ab0c-d80d-4dd5-98d2-8f09ec505527/ovsdb-server-init/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.655778 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7bnqw_2d20ab0c-d80d-4dd5-98d2-8f09ec505527/ovs-vswitchd/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.718941 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-7bnqw_2d20ab0c-d80d-4dd5-98d2-8f09ec505527/ovsdb-server/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.871727 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_adc9ee4b-6d85-4100-8d10-64163bf250c0/openstack-network-exporter/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.887124 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-z2brk_0cffba40-2751-4ab1-a9b5-9c8c041a83f8/ovn-controller/0.log"
Nov 29 03:58:02 crc kubenswrapper[4749]: I1129 03:58:02.944758 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_adc9ee4b-6d85-4100-8d10-64163bf250c0/ovn-northd/0.log"
Nov 29 03:58:03 crc kubenswrapper[4749]: I1129 03:58:03.221397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6665f4d3-5e86-4e4e-af41-9574adad9b2d/openstack-network-exporter/0.log"
Nov 29 03:58:03 crc kubenswrapper[4749]: I1129 03:58:03.253992 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-dlzxk_033a5a84-a10e-4a5f-a7f5-de6f348f5b32/ovn-openstack-openstack-cell1/0.log"
Nov 29 03:58:03 crc kubenswrapper[4749]: I1129 03:58:03.344659 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6665f4d3-5e86-4e4e-af41-9574adad9b2d/ovsdbserver-nb/0.log"
Nov 29 03:58:03 crc kubenswrapper[4749]: I1129 03:58:03.437414 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3a416ad5-b3da-4bd9-949f-23485a7d2647/ovsdbserver-nb/0.log"
Nov 29 03:58:03 crc kubenswrapper[4749]: I1129 03:58:03.472431 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3a416ad5-b3da-4bd9-949f-23485a7d2647/openstack-network-exporter/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.201243 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_30163aa0-30e4-4c0e-a703-47bb8a18bf07/openstack-network-exporter/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.219693 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_30163aa0-30e4-4c0e-a703-47bb8a18bf07/ovsdbserver-nb/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.228627 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a12be827-924f-4ff4-8fba-c2e78d1222d0/openstack-network-exporter/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.389955 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_a12be827-924f-4ff4-8fba-c2e78d1222d0/ovsdbserver-sb/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.438925 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_a4d50cec-a30f-49a1-857b-3181d5d1e632/openstack-network-exporter/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.470052 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_a4d50cec-a30f-49a1-857b-3181d5d1e632/ovsdbserver-sb/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.699241 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5a5702b1-caee-424f-b2ba-e62faf326574/openstack-network-exporter/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.750577 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5a5702b1-caee-424f-b2ba-e62faf326574/ovsdbserver-sb/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.949123 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-567fc5ff4d-d2cnq_bd4b1978-3d21-4daa-89b8-292d5e4cdf9e/placement-api/0.log"
Nov 29 03:58:04 crc kubenswrapper[4749]: I1129 03:58:04.971697 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-567fc5ff4d-d2cnq_bd4b1978-3d21-4daa-89b8-292d5e4cdf9e/placement-log/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.029165 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cgbhxs_1dfc44dd-ef6d-44d9-9bcf-e1a28cdb71d6/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.206979 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4bc6897f-040b-48d5-ad08-eec6d6f8f671/init-config-reloader/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.398044 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4bc6897f-040b-48d5-ad08-eec6d6f8f671/init-config-reloader/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.401539 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4bc6897f-040b-48d5-ad08-eec6d6f8f671/config-reloader/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.413387 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4bc6897f-040b-48d5-ad08-eec6d6f8f671/prometheus/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.458588 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_4bc6897f-040b-48d5-ad08-eec6d6f8f671/thanos-sidecar/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.620949 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0af1f7d-07fd-485b-ba0e-57d4e2a1c781/setup-container/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.843341 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0af1f7d-07fd-485b-ba0e-57d4e2a1c781/rabbitmq/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.911825 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0af1f7d-07fd-485b-ba0e-57d4e2a1c781/setup-container/0.log"
Nov 29 03:58:05 crc kubenswrapper[4749]: I1129 03:58:05.918342 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b55c421-1415-42f3-a604-93a5405aa469/setup-container/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.071837 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b55c421-1415-42f3-a604-93a5405aa469/setup-container/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.073326 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-s6bfv_59ea7fe7-a726-4e18-bd71-11070ae29d0a/reboot-os-openstack-openstack-cell1/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.231509 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0564ef85-5416-4525-8e67-55cd54992646/memcached/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.306327 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-wf5c7_0d96903e-8ce9-4112-abcf-0151817e99a8/run-os-openstack-openstack-cell1/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.436798 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-8mjpf_1ef348e7-3d36-45b4-90f8-582d82bc0d4a/ssh-known-hosts-openstack/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.441408 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b55c421-1415-42f3-a604-93a5405aa469/rabbitmq/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.546221 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-npfts_0b460379-aed8-40bb-b56e-f20fc64761bf/telemetry-openstack-openstack-cell1/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.606744 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-5j4t5_234ff02d-f844-46c2-9a13-9cc6ce370926/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Nov 29 03:58:06 crc kubenswrapper[4749]: I1129 03:58:06.734059 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-5t9tc_8a0dca15-f50b-4ac0-9d64-052462449692/validate-network-openstack-openstack-cell1/0.log"
Nov 29 03:58:13 crc kubenswrapper[4749]: I1129 03:58:13.075971 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:58:13 crc kubenswrapper[4749]: E1129 03:58:13.076795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:58:28 crc kubenswrapper[4749]: I1129 03:58:28.076596 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:58:28 crc kubenswrapper[4749]: E1129 03:58:28.077171 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
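Bursts of "Finished parsing log file" like the run above are consistent with a log collector (for example must-gather) sweeping every container log under /var/log/pods in quick succession. One way to summarize such a sweep from the kubelet journal on the node; the time window is taken from the entries above and the pipeline itself is illustrative:

    # Tally which log files the kubelet re-read during the collection window.
    journalctl -u kubelet --since 03:57:50 --until 03:58:07 \
      | grep -o 'path="[^"]*"' | sort | uniq -c | sort -rn | head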
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.168327 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/util/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.331838 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/pull/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.338417 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/util/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.347282 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/pull/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.477396 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/util/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.518570 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/pull/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.530657 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2fb0bf10c288d38d92533277ae476ba044b2096e3ccc8333e5a505d237wgxgw_ecba3528-c9f0-4ec0-8b76-34aad22f4d4b/extract/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.699986 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-5jr7h_3f250151-87d8-495b-895a-c43205c7b8ce/kube-rbac-proxy/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.781581 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-6xbh8_08e10646-6c79-42a1-8180-b2f7595e73ce/kube-rbac-proxy/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.793984 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-5jr7h_3f250151-87d8-495b-895a-c43205c7b8ce/manager/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.961769 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-x7lmd_3bbc7cc2-0efd-4d6f-b424-d1558ed9f040/kube-rbac-proxy/0.log"
Nov 29 03:58:30 crc kubenswrapper[4749]: I1129 03:58:30.976362 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-6xbh8_08e10646-6c79-42a1-8180-b2f7595e73ce/manager/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.036110 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-x7lmd_3bbc7cc2-0efd-4d6f-b424-d1558ed9f040/manager/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.147081 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-wm9q9_9e0d34d5-9c78-4c5b-8081-e076cde59208/kube-rbac-proxy/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.384906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-668d9c48b9-wm9q9_9e0d34d5-9c78-4c5b-8081-e076cde59208/manager/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.437543 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s5vxs_ae863e3f-87c3-4712-9e4d-5fcfa63df10b/manager/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.439401 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-s5vxs_ae863e3f-87c3-4712-9e4d-5fcfa63df10b/kube-rbac-proxy/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.610522 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-ww9sb_0ba380f8-eaae-4987-add4-bdd6aa96f090/kube-rbac-proxy/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.676407 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-ww9sb_0ba380f8-eaae-4987-add4-bdd6aa96f090/manager/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.696042 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-l9m8x_c707c92a-5aaa-40ca-a7ae-5ee5db538c3c/kube-rbac-proxy/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.906485 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-662nk_ee9e0c71-281c-41b2-b566-c0222b456f23/kube-rbac-proxy/0.log"
Nov 29 03:58:31 crc kubenswrapper[4749]: I1129 03:58:31.944543 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-662nk_ee9e0c71-281c-41b2-b566-c0222b456f23/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.128894 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-l9m8x_c707c92a-5aaa-40ca-a7ae-5ee5db538c3c/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.163147 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-5c65k_b2daf909-0247-4a43-a96a-a136e5268260/kube-rbac-proxy/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.284396 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-546d4bdf48-5c65k_b2daf909-0247-4a43-a96a-a136e5268260/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.379567 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-f499g_d0754b9d-ce96-4174-83f2-c4436e7d8195/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.383520 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6546668bfd-f499g_d0754b9d-ce96-4174-83f2-c4436e7d8195/kube-rbac-proxy/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.492689 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-56rwq_8609ba03-25af-49c9-b521-8c637dab5e91/kube-rbac-proxy/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.608838 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-56rwq_8609ba03-25af-49c9-b521-8c637dab5e91/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.673373 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-l5s92_9ed58501-79d8-4626-bd9f-dae8a95c872c/kube-rbac-proxy/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.756637 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-l5s92_9ed58501-79d8-4626-bd9f-dae8a95c872c/manager/0.log"
Nov 29 03:58:32 crc kubenswrapper[4749]: I1129 03:58:32.839249 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7hch7_2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f/kube-rbac-proxy/0.log"
Nov 29 03:58:33 crc kubenswrapper[4749]: I1129 03:58:33.020598 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8lkfs_2ad01dbb-582f-4074-a985-76067fc2bed3/kube-rbac-proxy/0.log"
Nov 29 03:58:33 crc kubenswrapper[4749]: I1129 03:58:33.074942 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-7hch7_2b7ed7c9-c3c7-498a-adb4-9ee1dcc0093f/manager/0.log"
Nov 29 03:58:33 crc kubenswrapper[4749]: I1129 03:58:33.101915 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-8lkfs_2ad01dbb-582f-4074-a985-76067fc2bed3/manager/0.log"
Nov 29 03:58:33 crc kubenswrapper[4749]: I1129 03:58:33.269330 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m_76c23b91-6df4-41e0-bcd3-eacc7e879aeb/kube-rbac-proxy/0.log"
Nov 29 03:58:33 crc kubenswrapper[4749]: I1129 03:58:33.280733 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4zr82m_76c23b91-6df4-41e0-bcd3-eacc7e879aeb/manager/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.188579 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mc697_ec8927f6-9db7-4af0-b09b-ecb2e7ebade2/registry-server/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.335172 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6dbf9ff7bd-zvr22_24610c1d-41d1-42a1-8aa1-654cf868a283/operator/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.510329 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2kq6d_b08fc4d5-cf16-49c3-b95d-e9175ab67846/kube-rbac-proxy/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.564561 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2kq6d_b08fc4d5-cf16-49c3-b95d-e9175ab67846/manager/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.579358 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dm22j_14d6b00a-a750-4bc4-9d78-12dcefeafe6b/kube-rbac-proxy/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.843544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-kcnjp_872c8278-f904-4ef0-8180-46fd4beea0dd/operator/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.851287 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-dm22j_14d6b00a-a750-4bc4-9d78-12dcefeafe6b/manager/0.log"
Nov 29 03:58:34 crc kubenswrapper[4749]: I1129 03:58:34.906897 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-ldkwb_beb586a3-ac88-42b7-b080-8b68cb73bf53/kube-rbac-proxy/0.log"
Nov 29 03:58:35 crc kubenswrapper[4749]: I1129 03:58:35.064397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-fg7sg_a493c4bc-b7d4-4e55-bc8f-205242be99eb/kube-rbac-proxy/0.log"
Nov 29 03:58:35 crc kubenswrapper[4749]: I1129 03:58:35.112129 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-ldkwb_beb586a3-ac88-42b7-b080-8b68cb73bf53/manager/0.log"
Nov 29 03:58:35 crc kubenswrapper[4749]: I1129 03:58:35.290716 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ssmp5_913736f1-2790-4ac3-a478-58de73caee8f/kube-rbac-proxy/0.log"
Nov 29 03:58:35 crc kubenswrapper[4749]: I1129 03:58:35.317470 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-fg7sg_a493c4bc-b7d4-4e55-bc8f-205242be99eb/manager/0.log"
Nov 29 03:58:35 crc kubenswrapper[4749]: I1129 03:58:35.361916 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-ssmp5_913736f1-2790-4ac3-a478-58de73caee8f/manager/0.log"
Nov 29 03:58:36 crc kubenswrapper[4749]: I1129 03:58:36.111411 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6jvjq_88cd9373-83ec-44e6-b108-04d0b853b5da/manager/0.log"
Nov 29 03:58:36 crc kubenswrapper[4749]: I1129 03:58:36.168242 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-6jvjq_88cd9373-83ec-44e6-b108-04d0b853b5da/kube-rbac-proxy/0.log"
Nov 29 03:58:36 crc kubenswrapper[4749]: I1129 03:58:36.592698 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6d7d7b9964-wws7p_be884b00-8556-44c8-83e8-c851267b63e2/manager/0.log"
Nov 29 03:58:43 crc kubenswrapper[4749]: I1129 03:58:43.075659 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:58:43 crc kubenswrapper[4749]: E1129 03:58:43.076471 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:58:57 crc kubenswrapper[4749]: I1129 03:58:57.226351 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-hj2jl_78e8e60a-3e83-41f4-8e5d-e502d05118ac/control-plane-machine-set-operator/0.log"
Nov 29 03:58:57 crc kubenswrapper[4749]: I1129 03:58:57.362453 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2mp4r_a19644e5-e46c-4286-8f46-5022e2bb45b4/kube-rbac-proxy/0.log"
Nov 29 03:58:57 crc kubenswrapper[4749]: I1129 03:58:57.392025 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-2mp4r_a19644e5-e46c-4286-8f46-5022e2bb45b4/machine-api-operator/0.log"
Nov 29 03:58:58 crc kubenswrapper[4749]: I1129 03:58:58.076114 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:58:58 crc kubenswrapper[4749]: E1129 03:58:58.076610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:59:11 crc kubenswrapper[4749]: I1129 03:59:11.561031 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-4fxz8_516a7074-ce39-4245-848d-8fc40b801000/cert-manager-controller/0.log"
Nov 29 03:59:11 crc kubenswrapper[4749]: I1129 03:59:11.725164 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-7ztw7_2d746906-3777-4543-a8c1-8bb4ff61fa60/cert-manager-cainjector/0.log"
Nov 29 03:59:11 crc kubenswrapper[4749]: I1129 03:59:11.753305 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-h9fjh_04dcd0b8-9e22-4ffc-b291-d89de26d9afe/cert-manager-webhook/0.log"
Nov 29 03:59:13 crc kubenswrapper[4749]: I1129 03:59:13.075556 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f"
Nov 29 03:59:13 crc kubenswrapper[4749]: E1129 03:59:13.076358 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171"
Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.501312 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-4vlld_02baa9c1-affa-4d66-afac-5c8bd20bf097/nmstate-console-plugin/0.log"
Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.618100 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-h75fj_0b80b7b3-a0dc-488b-9431-2016284ab8af/nmstate-handler/0.log"
Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.706534 4749 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qsmwj_50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe/nmstate-metrics/0.log" Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.719769 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qsmwj_50eb0349-3cd4-44cf-93ad-76f0ed8ef9fe/kube-rbac-proxy/0.log" Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.936023 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-dskn2_21426103-df8a-47f8-ac5d-60c843e56c3d/nmstate-operator/0.log" Nov 29 03:59:26 crc kubenswrapper[4749]: I1129 03:59:26.958827 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-7jn4q_e66f7428-ba60-42fc-92b6-b45c1d974b8b/nmstate-webhook/0.log" Nov 29 03:59:27 crc kubenswrapper[4749]: I1129 03:59:27.082147 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" Nov 29 03:59:27 crc kubenswrapper[4749]: E1129 03:59:27.082431 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:59:41 crc kubenswrapper[4749]: I1129 03:59:41.075030 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" Nov 29 03:59:41 crc kubenswrapper[4749]: E1129 03:59:41.075801 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:59:45 crc kubenswrapper[4749]: I1129 03:59:45.471439 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5z7ln_ba1f212b-3cc2-4f6e-9b71-443f17d0e113/kube-rbac-proxy/0.log" Nov 29 03:59:45 crc kubenswrapper[4749]: I1129 03:59:45.707074 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-frr-files/0.log" Nov 29 03:59:45 crc kubenswrapper[4749]: I1129 03:59:45.993928 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-frr-files/0.log" Nov 29 03:59:45 crc kubenswrapper[4749]: I1129 03:59:45.999303 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5z7ln_ba1f212b-3cc2-4f6e-9b71-443f17d0e113/controller/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.001256 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-reloader/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.057620 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-metrics/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.173895 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-reloader/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.566839 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-metrics/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.586726 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-frr-files/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.587053 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-metrics/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.629392 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-reloader/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.799596 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-reloader/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.800353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-frr-files/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.805581 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/cp-metrics/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.840785 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/controller/0.log" Nov 29 03:59:46 crc kubenswrapper[4749]: I1129 03:59:46.998739 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/kube-rbac-proxy/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.006112 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/kube-rbac-proxy-frr/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.034662 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/frr-metrics/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.262947 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/reloader/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.327775 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-kkp2m_cf49ecc1-5b2d-4b0b-a06e-0193e60947cd/frr-k8s-webhook-server/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.494394 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b479d995c-252wh_ef286966-9492-4971-a5a1-072fd0de42e6/manager/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.737095 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68569fcffb-jj942_7240a7b6-7fc6-4960-8db0-55e75820bd36/webhook-server/0.log" Nov 29 03:59:47 crc kubenswrapper[4749]: I1129 03:59:47.817396 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-tdt6d_1bd292ff-8eb0-4ed0-95cd-6ba367873d7a/kube-rbac-proxy/0.log" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.882246 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 03:59:48 crc kubenswrapper[4749]: E1129 03:59:48.882663 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8f362c-e447-4bb0-b04b-13abefba9933" containerName="container-00" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.882676 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8f362c-e447-4bb0-b04b-13abefba9933" containerName="container-00" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.882890 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8f362c-e447-4bb0-b04b-13abefba9933" containerName="container-00" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.890422 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.897692 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.964845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c48kd\" (UniqueName: \"kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.965125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:48 crc kubenswrapper[4749]: I1129 03:59:48.965594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.067598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.067666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c48kd\" (UniqueName: \"kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.067748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities\") pod \"redhat-operators-gpl65\" (UID: 
\"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.068163 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.068173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.089682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c48kd\" (UniqueName: \"kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd\") pod \"redhat-operators-gpl65\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:49 crc kubenswrapper[4749]: I1129 03:59:49.222680 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 03:59:51 crc kubenswrapper[4749]: I1129 03:59:51.034009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 03:59:51 crc kubenswrapper[4749]: I1129 03:59:51.389044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerStarted","Data":"13edd743bb965a0ceb51fefb0db01ee25178cf0228ef9fdc3af20acbc1eda550"} Nov 29 03:59:51 crc kubenswrapper[4749]: I1129 03:59:51.426732 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-tdt6d_1bd292ff-8eb0-4ed0-95cd-6ba367873d7a/speaker/0.log" Nov 29 03:59:52 crc kubenswrapper[4749]: I1129 03:59:52.074946 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" Nov 29 03:59:52 crc kubenswrapper[4749]: E1129 03:59:52.075560 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mnsct_openshift-machine-config-operator(800b3936-ba93-47d8-9417-2fdc5ce4d171)\"" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" Nov 29 03:59:53 crc kubenswrapper[4749]: I1129 03:59:53.337039 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rb7rz_6937d960-a5b2-45b6-99cb-1f6ed6e0563a/frr/0.log" Nov 29 03:59:53 crc kubenswrapper[4749]: I1129 03:59:53.406745 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerID="3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870" exitCode=0 Nov 29 03:59:53 crc kubenswrapper[4749]: I1129 03:59:53.406782 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" 
event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerDied","Data":"3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870"} Nov 29 03:59:53 crc kubenswrapper[4749]: I1129 03:59:53.408436 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 29 03:59:55 crc kubenswrapper[4749]: I1129 03:59:55.435445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerStarted","Data":"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477"} Nov 29 03:59:58 crc kubenswrapper[4749]: I1129 03:59:58.485353 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerID="e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477" exitCode=0 Nov 29 03:59:58 crc kubenswrapper[4749]: I1129 03:59:58.485771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerDied","Data":"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477"} Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.744154 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.747375 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.759419 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.921588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.922286 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.922456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znv9s\" (UniqueName: \"kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.942586 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.946322 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 03:59:59 crc kubenswrapper[4749]: I1129 03:59:59.976927 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.024407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znv9s\" (UniqueName: \"kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.024821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.024964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.025480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.026047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.127815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.128148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.128313 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l66tk\" (UniqueName: \"kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.182141 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55"] Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.184025 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.186908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.187134 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.199632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znv9s\" (UniqueName: \"kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s\") pod \"redhat-marketplace-f8czm\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.204113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55"] Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.259912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l66tk\" (UniqueName: \"kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.260100 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.260184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.260616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.263048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content\") pod \"community-operators-sfl9h\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.291042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l66tk\" (UniqueName: \"kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk\") pod \"community-operators-sfl9h\" 
(UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.362931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxs7\" (UniqueName: \"kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.363267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.363408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.373390 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.466955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxs7\" (UniqueName: \"kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.467106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.467149 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.468338 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.471735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume\") 
pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.501104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxs7\" (UniqueName: \"kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7\") pod \"collect-profiles-29406480-z4g55\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.602612 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.672676 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:00 crc kubenswrapper[4749]: I1129 04:00:00.952116 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 04:00:01 crc kubenswrapper[4749]: W1129 04:00:01.175645 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46118562_742a_471e_bbb6_dfa7a82cc6e4.slice/crio-d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5 WatchSource:0}: Error finding container d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5: Status 404 returned error can't find the container with id d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5 Nov 29 04:00:01 crc kubenswrapper[4749]: I1129 04:00:01.176892 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55"] Nov 29 04:00:01 crc kubenswrapper[4749]: W1129 04:00:01.184428 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5115c6de_1bcf_4098_b834_9411a31ff7a9.slice/crio-12aa4b19b855751615865998509870929e0fd293932ec5a2d226730940baabde WatchSource:0}: Error finding container 12aa4b19b855751615865998509870929e0fd293932ec5a2d226730940baabde: Status 404 returned error can't find the container with id 12aa4b19b855751615865998509870929e0fd293932ec5a2d226730940baabde Nov 29 04:00:01 crc kubenswrapper[4749]: I1129 04:00:01.186241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 04:00:01 crc kubenswrapper[4749]: I1129 04:00:01.526239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" event={"ID":"46118562-742a-471e-bbb6-dfa7a82cc6e4","Type":"ContainerStarted","Data":"d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5"} Nov 29 04:00:01 crc kubenswrapper[4749]: I1129 04:00:01.527236 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerStarted","Data":"12aa4b19b855751615865998509870929e0fd293932ec5a2d226730940baabde"} Nov 29 04:00:01 crc kubenswrapper[4749]: I1129 04:00:01.528293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" 
event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerStarted","Data":"ad39183419b117a6f463e188f724fe8ead6d0df9d3fdd08c7b1603dbe267dee9"} Nov 29 04:00:02 crc kubenswrapper[4749]: I1129 04:00:02.941801 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:02 crc kubenswrapper[4749]: I1129 04:00:02.950177 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:02 crc kubenswrapper[4749]: I1129 04:00:02.968050 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.044491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqxm\" (UniqueName: \"kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.044636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.044672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.146733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.147069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.147265 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqxm\" (UniqueName: \"kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.147534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 
04:00:03.147536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.166381 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqxm\" (UniqueName: \"kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm\") pod \"certified-operators-5zrrf\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.266957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.573653 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" event={"ID":"46118562-742a-471e-bbb6-dfa7a82cc6e4","Type":"ContainerStarted","Data":"6629c839994f98b2db505533d18d63c82f3908d4ed35b3a52f46eb8ad8032366"} Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.575155 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerStarted","Data":"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079"} Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.587413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerStarted","Data":"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627"} Nov 29 04:00:03 crc kubenswrapper[4749]: I1129 04:00:03.833191 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:04 crc kubenswrapper[4749]: I1129 04:00:04.600762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerStarted","Data":"21e276ddea466094b5ab9f4781209c936b7e9073394f00162ae3313f67a636ab"} Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.075708 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.428582 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/util/0.log" Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.614951 4749 generic.go:334] "Generic (PLEG): container finished" podID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerID="17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079" exitCode=0 Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.615030 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerDied","Data":"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079"} Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.616858 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerID="fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627" exitCode=0 Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.616882 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerDied","Data":"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627"} Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.815459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/util/0.log" Nov 29 04:00:05 crc kubenswrapper[4749]: I1129 04:00:05.884761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/util/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.195073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/pull/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.195684 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/pull/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.201190 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/pull/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.375137 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aspnw7_54610f30-edc2-4aed-b76f-ba1ce5451386/extract/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.459316 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/util/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.632292 4749 generic.go:334] "Generic (PLEG): container finished" podID="46118562-742a-471e-bbb6-dfa7a82cc6e4" containerID="6629c839994f98b2db505533d18d63c82f3908d4ed35b3a52f46eb8ad8032366" exitCode=0 Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.632351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" event={"ID":"46118562-742a-471e-bbb6-dfa7a82cc6e4","Type":"ContainerDied","Data":"6629c839994f98b2db505533d18d63c82f3908d4ed35b3a52f46eb8ad8032366"} Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.634935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"577fb5ad0e14eb75e16df5dc241a22829818fdfa2374e8c0dcd5dc8df5c702c1"} Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.638352 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae935673-9bf7-43a1-9750-0b524875788f" containerID="443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316" exitCode=0 Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 
04:00:06.639854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerDied","Data":"443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316"} Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.703847 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/util/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.791952 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/pull/0.log" Nov 29 04:00:06 crc kubenswrapper[4749]: I1129 04:00:06.793844 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/pull/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.016697 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/util/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.029116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/extract/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.092713 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f69j7s_994ad8e8-6c08-4674-a8dd-715d8c8f1e5b/pull/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.247597 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/util/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.660602 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerStarted","Data":"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a"} Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.688930 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gpl65" podStartSLOduration=6.653264041 podStartE2EDuration="19.688905707s" podCreationTimestamp="2025-11-29 03:59:48 +0000 UTC" firstStartedPulling="2025-11-29 03:59:53.40816662 +0000 UTC m=+10136.580316477" lastFinishedPulling="2025-11-29 04:00:06.443808286 +0000 UTC m=+10149.615958143" observedRunningTime="2025-11-29 04:00:07.681260041 +0000 UTC m=+10150.853409898" watchObservedRunningTime="2025-11-29 04:00:07.688905707 +0000 UTC m=+10150.861055564" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.728740 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/pull/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.730311 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/pull/0.log" Nov 29 04:00:07 crc kubenswrapper[4749]: I1129 04:00:07.829529 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/util/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.133023 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.166995 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/extract/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.192907 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/util/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.192972 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxs7\" (UniqueName: \"kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7\") pod \"46118562-742a-471e-bbb6-dfa7a82cc6e4\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.193104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume\") pod \"46118562-742a-471e-bbb6-dfa7a82cc6e4\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.193177 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume\") pod \"46118562-742a-471e-bbb6-dfa7a82cc6e4\" (UID: \"46118562-742a-471e-bbb6-dfa7a82cc6e4\") " Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.194768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "46118562-742a-471e-bbb6-dfa7a82cc6e4" (UID: "46118562-742a-471e-bbb6-dfa7a82cc6e4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.199731 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7" (OuterVolumeSpecName: "kube-api-access-dgxs7") pod "46118562-742a-471e-bbb6-dfa7a82cc6e4" (UID: "46118562-742a-471e-bbb6-dfa7a82cc6e4"). InnerVolumeSpecName "kube-api-access-dgxs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.200326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "46118562-742a-471e-bbb6-dfa7a82cc6e4" (UID: "46118562-742a-471e-bbb6-dfa7a82cc6e4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.235928 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92105lgt5_41b635b5-bb08-4a95-b27e-56d1ea63ffc3/pull/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.295843 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxs7\" (UniqueName: \"kubernetes.io/projected/46118562-742a-471e-bbb6-dfa7a82cc6e4-kube-api-access-dgxs7\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.295875 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46118562-742a-471e-bbb6-dfa7a82cc6e4-config-volume\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.295886 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/46118562-742a-471e-bbb6-dfa7a82cc6e4-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.490479 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/util/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.687928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerStarted","Data":"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189"} Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.702865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerStarted","Data":"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18"} Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.709213 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" event={"ID":"46118562-742a-471e-bbb6-dfa7a82cc6e4","Type":"ContainerDied","Data":"d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5"} Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.709253 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d768b1f214854f63a2a80435e94a4cb2632f8226095aba1260d5925c4d8539e5" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.709317 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29406480-z4g55" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.714448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerStarted","Data":"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf"} Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.751372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/pull/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: E1129 04:00:08.774882 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46118562_742a_471e_bbb6_dfa7a82cc6e4.slice\": RecentStats: unable to find data in memory cache]" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.878918 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/pull/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.942460 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/util/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.972080 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/pull/0.log" Nov 29 04:00:08 crc kubenswrapper[4749]: I1129 04:00:08.998053 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/util/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.006220 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83s765c_d8ea565f-9a8a-48f1-aba9-d603fcf591c4/extract/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.207294 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-utilities/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.222462 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7"] Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.223260 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.223290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.234066 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29406435-cqtm7"] Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.407761 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-content/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.407863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-utilities/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.440786 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-content/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.600252 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-content/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.645016 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/extract-utilities/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.661317 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-utilities/0.log" Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.724149 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerID="8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189" exitCode=0 Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.724486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerDied","Data":"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189"} Nov 29 04:00:09 crc kubenswrapper[4749]: I1129 04:00:09.999826 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-content/0.log" Nov 29 04:00:10 crc kubenswrapper[4749]: I1129 04:00:10.023254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-utilities/0.log" Nov 29 04:00:10 crc kubenswrapper[4749]: I1129 04:00:10.036917 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-content/0.log" Nov 29 04:00:10 crc kubenswrapper[4749]: I1129 04:00:10.276598 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gpl65" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" probeResult="failure" output=< Nov 29 04:00:10 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 04:00:10 crc kubenswrapper[4749]: > Nov 29 04:00:10 crc kubenswrapper[4749]: I1129 04:00:10.400275 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-utilities/0.log" Nov 29 04:00:10 crc kubenswrapper[4749]: I1129 04:00:10.407558 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5zrrf_ae935673-9bf7-43a1-9750-0b524875788f/extract-content/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.093794 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3803407e-f7af-4747-82ad-ecb8b23db732" path="/var/lib/kubelet/pods/3803407e-f7af-4747-82ad-ecb8b23db732/volumes" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.205391 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-utilities/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.454970 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-content/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.468892 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-utilities/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.485413 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-content/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.729313 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-utilities/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.732214 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/extract-content/0.log" Nov 29 04:00:11 crc kubenswrapper[4749]: I1129 04:00:11.896187 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-utilities/0.log" Nov 29 04:00:12 crc kubenswrapper[4749]: I1129 04:00:12.171039 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-utilities/0.log" Nov 29 04:00:12 crc kubenswrapper[4749]: I1129 04:00:12.173635 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-content/0.log" Nov 29 04:00:12 crc kubenswrapper[4749]: I1129 04:00:12.180655 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-content/0.log" Nov 29 04:00:12 crc kubenswrapper[4749]: I1129 04:00:12.397584 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-content/0.log" Nov 29 04:00:12 crc kubenswrapper[4749]: I1129 04:00:12.400775 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-sfl9h_5115c6de-1bcf-4098-b834-9411a31ff7a9/extract-utilities/0.log" Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 04:00:13.254152 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-m564m_1b11e2f5-ae7c-4297-97cb-e217d0947051/marketplace-operator/0.log" Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 04:00:13.589442 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-48gh2_ec2edf35-170e-4586-8e6f-c563db51b6b7/registry-server/0.log" Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 
04:00:13.757300 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-utilities/0.log" Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 04:00:13.761588 4749 generic.go:334] "Generic (PLEG): container finished" podID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerID="9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf" exitCode=0 Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 04:00:13.761627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerDied","Data":"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf"} Nov 29 04:00:13 crc kubenswrapper[4749]: I1129 04:00:13.997832 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-content/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.002361 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-utilities/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.030502 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-content/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.291962 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-utilities/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.401496 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f8czm_dc28e07b-0a3c-42ac-8d98-1acc8da528ff/extract-content/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.556931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-utilities/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.786544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerStarted","Data":"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de"} Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.790056 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae935673-9bf7-43a1-9750-0b524875788f" containerID="1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18" exitCode=0 Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.790089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerDied","Data":"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18"} Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.803504 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8czm" podStartSLOduration=8.411889551 podStartE2EDuration="15.803491167s" podCreationTimestamp="2025-11-29 03:59:59 +0000 UTC" firstStartedPulling="2025-11-29 04:00:06.641722226 +0000 UTC m=+10149.813872093" lastFinishedPulling="2025-11-29 04:00:14.033323852 +0000 UTC m=+10157.205473709" 
observedRunningTime="2025-11-29 04:00:14.802962364 +0000 UTC m=+10157.975112221" watchObservedRunningTime="2025-11-29 04:00:14.803491167 +0000 UTC m=+10157.975641024" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.844618 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cgq68_7d47bb92-91e4-4b99-9c6a-86ec5c95396a/registry-server/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.865169 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-utilities/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.903380 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-content/0.log" Nov 29 04:00:14 crc kubenswrapper[4749]: I1129 04:00:14.932254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-content/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.193475 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-utilities/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.245520 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-content/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.296104 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/extract-utilities/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.598615 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-content/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.616690 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-content/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.631315 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-utilities/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.718114 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rvzcw_b44c30b9-6b5c-40fd-9f73-0072f941ffeb/registry-server/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.801147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerStarted","Data":"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402"} Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.803254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerStarted","Data":"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac"} Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.827367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-5zrrf" podStartSLOduration=4.978407075 podStartE2EDuration="13.827350671s" podCreationTimestamp="2025-11-29 04:00:02 +0000 UTC" firstStartedPulling="2025-11-29 04:00:06.641421898 +0000 UTC m=+10149.813571755" lastFinishedPulling="2025-11-29 04:00:15.490365494 +0000 UTC m=+10158.662515351" observedRunningTime="2025-11-29 04:00:15.820110344 +0000 UTC m=+10158.992260201" watchObservedRunningTime="2025-11-29 04:00:15.827350671 +0000 UTC m=+10158.999500528" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.838146 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sfl9h" podStartSLOduration=8.808933141 podStartE2EDuration="16.838128033s" podCreationTimestamp="2025-11-29 03:59:59 +0000 UTC" firstStartedPulling="2025-11-29 04:00:06.642261069 +0000 UTC m=+10149.814410926" lastFinishedPulling="2025-11-29 04:00:14.671455961 +0000 UTC m=+10157.843605818" observedRunningTime="2025-11-29 04:00:15.834679509 +0000 UTC m=+10159.006829366" watchObservedRunningTime="2025-11-29 04:00:15.838128033 +0000 UTC m=+10159.010277890" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.948752 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-utilities/0.log" Nov 29 04:00:15 crc kubenswrapper[4749]: I1129 04:00:15.970081 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/registry-server/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.264709 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-utilities/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.278249 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-gpl65_d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/extract-content/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.492723 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-content/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.527336 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-utilities/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.645619 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-content/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.809588 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-content/0.log" Nov 29 04:00:16 crc kubenswrapper[4749]: I1129 04:00:16.831251 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/extract-utilities/0.log" Nov 29 04:00:17 crc kubenswrapper[4749]: I1129 04:00:17.876786 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lk96f_ef97226b-aa25-4088-a39b-0015a132dd8c/registry-server/0.log" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.278079 4749 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gpl65" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" probeResult="failure" output=< Nov 29 04:00:20 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 04:00:20 crc kubenswrapper[4749]: > Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.374023 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.374098 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.442731 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.603294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.603376 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:20 crc kubenswrapper[4749]: I1129 04:00:20.938482 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:21 crc kubenswrapper[4749]: I1129 04:00:21.675918 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-sfl9h" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="registry-server" probeResult="failure" output=< Nov 29 04:00:21 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Nov 29 04:00:21 crc kubenswrapper[4749]: > Nov 29 04:00:21 crc kubenswrapper[4749]: I1129 04:00:21.733638 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 04:00:22 crc kubenswrapper[4749]: I1129 04:00:22.878388 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8czm" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="registry-server" containerID="cri-o://1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de" gracePeriod=2 Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.267335 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.267660 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.346746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.438329 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.588229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities\") pod \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.588392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content\") pod \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.588480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znv9s\" (UniqueName: \"kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s\") pod \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\" (UID: \"dc28e07b-0a3c-42ac-8d98-1acc8da528ff\") " Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.589055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities" (OuterVolumeSpecName: "utilities") pod "dc28e07b-0a3c-42ac-8d98-1acc8da528ff" (UID: "dc28e07b-0a3c-42ac-8d98-1acc8da528ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.595492 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s" (OuterVolumeSpecName: "kube-api-access-znv9s") pod "dc28e07b-0a3c-42ac-8d98-1acc8da528ff" (UID: "dc28e07b-0a3c-42ac-8d98-1acc8da528ff"). InnerVolumeSpecName "kube-api-access-znv9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.605029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc28e07b-0a3c-42ac-8d98-1acc8da528ff" (UID: "dc28e07b-0a3c-42ac-8d98-1acc8da528ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.691137 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znv9s\" (UniqueName: \"kubernetes.io/projected/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-kube-api-access-znv9s\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.691172 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.691186 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc28e07b-0a3c-42ac-8d98-1acc8da528ff-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.890423 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerID="1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de" exitCode=0 Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.890512 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8czm" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.891581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerDied","Data":"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de"} Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.891627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8czm" event={"ID":"dc28e07b-0a3c-42ac-8d98-1acc8da528ff","Type":"ContainerDied","Data":"ad39183419b117a6f463e188f724fe8ead6d0df9d3fdd08c7b1603dbe267dee9"} Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.891647 4749 scope.go:117] "RemoveContainer" containerID="1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.927153 4749 scope.go:117] "RemoveContainer" containerID="8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.934752 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.951268 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8czm"] Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.956800 4749 scope.go:117] "RemoveContainer" containerID="fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627" Nov 29 04:00:23 crc kubenswrapper[4749]: I1129 04:00:23.956875 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.008456 4749 scope.go:117] "RemoveContainer" containerID="1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de" Nov 29 04:00:24 crc kubenswrapper[4749]: E1129 04:00:24.008859 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de\": container with ID starting with 
1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de not found: ID does not exist" containerID="1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.008899 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de"} err="failed to get container status \"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de\": rpc error: code = NotFound desc = could not find container \"1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de\": container with ID starting with 1a5c3226339dbeb65d066e8daea0c258b8c646edec53ca9deb1aa027291a03de not found: ID does not exist" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.008923 4749 scope.go:117] "RemoveContainer" containerID="8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189" Nov 29 04:00:24 crc kubenswrapper[4749]: E1129 04:00:24.009350 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189\": container with ID starting with 8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189 not found: ID does not exist" containerID="8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.009395 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189"} err="failed to get container status \"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189\": rpc error: code = NotFound desc = could not find container \"8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189\": container with ID starting with 8e91d0d310441a64451f34908a5c638e2981080882c2d6a933538b89f8d23189 not found: ID does not exist" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.009422 4749 scope.go:117] "RemoveContainer" containerID="fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627" Nov 29 04:00:24 crc kubenswrapper[4749]: E1129 04:00:24.010117 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627\": container with ID starting with fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627 not found: ID does not exist" containerID="fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.010210 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627"} err="failed to get container status \"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627\": rpc error: code = NotFound desc = could not find container \"fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627\": container with ID starting with fae31935619f25eb5c5bd81ac48c2dc41db72365a8b818e9156af72be3e7d627 not found: ID does not exist" Nov 29 04:00:24 crc kubenswrapper[4749]: I1129 04:00:24.739800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:25 crc kubenswrapper[4749]: I1129 04:00:25.096539 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" path="/var/lib/kubelet/pods/dc28e07b-0a3c-42ac-8d98-1acc8da528ff/volumes" Nov 29 04:00:25 crc kubenswrapper[4749]: I1129 04:00:25.913659 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5zrrf" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="registry-server" containerID="cri-o://4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402" gracePeriod=2 Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.422043 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.553868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content\") pod \"ae935673-9bf7-43a1-9750-0b524875788f\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.555653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mqxm\" (UniqueName: \"kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm\") pod \"ae935673-9bf7-43a1-9750-0b524875788f\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.555750 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities\") pod \"ae935673-9bf7-43a1-9750-0b524875788f\" (UID: \"ae935673-9bf7-43a1-9750-0b524875788f\") " Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.556409 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities" (OuterVolumeSpecName: "utilities") pod "ae935673-9bf7-43a1-9750-0b524875788f" (UID: "ae935673-9bf7-43a1-9750-0b524875788f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.556668 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.569925 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm" (OuterVolumeSpecName: "kube-api-access-5mqxm") pod "ae935673-9bf7-43a1-9750-0b524875788f" (UID: "ae935673-9bf7-43a1-9750-0b524875788f"). InnerVolumeSpecName "kube-api-access-5mqxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.598420 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae935673-9bf7-43a1-9750-0b524875788f" (UID: "ae935673-9bf7-43a1-9750-0b524875788f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.658570 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae935673-9bf7-43a1-9750-0b524875788f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.658615 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mqxm\" (UniqueName: \"kubernetes.io/projected/ae935673-9bf7-43a1-9750-0b524875788f-kube-api-access-5mqxm\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.926502 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae935673-9bf7-43a1-9750-0b524875788f" containerID="4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402" exitCode=0 Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.926558 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerDied","Data":"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402"} Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.926596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5zrrf" event={"ID":"ae935673-9bf7-43a1-9750-0b524875788f","Type":"ContainerDied","Data":"21e276ddea466094b5ab9f4781209c936b7e9073394f00162ae3313f67a636ab"} Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.926604 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5zrrf" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.926617 4749 scope.go:117] "RemoveContainer" containerID="4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.954844 4749 scope.go:117] "RemoveContainer" containerID="1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.977504 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.979543 4749 scope.go:117] "RemoveContainer" containerID="443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316" Nov 29 04:00:26 crc kubenswrapper[4749]: I1129 04:00:26.996773 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5zrrf"] Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.058278 4749 scope.go:117] "RemoveContainer" containerID="4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402" Nov 29 04:00:27 crc kubenswrapper[4749]: E1129 04:00:27.058751 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402\": container with ID starting with 4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402 not found: ID does not exist" containerID="4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.058793 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402"} err="failed to get container status 
\"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402\": rpc error: code = NotFound desc = could not find container \"4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402\": container with ID starting with 4bb0396effdbee81868d1a29fbc6f737b6516e96a04a3a74e13d257f2a96b402 not found: ID does not exist" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.058825 4749 scope.go:117] "RemoveContainer" containerID="1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18" Nov 29 04:00:27 crc kubenswrapper[4749]: E1129 04:00:27.059170 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18\": container with ID starting with 1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18 not found: ID does not exist" containerID="1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.059274 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18"} err="failed to get container status \"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18\": rpc error: code = NotFound desc = could not find container \"1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18\": container with ID starting with 1b0e252e36e3a23f27b11acc383b246ae515648769ce5d17671f5715c189af18 not found: ID does not exist" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.059314 4749 scope.go:117] "RemoveContainer" containerID="443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316" Nov 29 04:00:27 crc kubenswrapper[4749]: E1129 04:00:27.059739 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316\": container with ID starting with 443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316 not found: ID does not exist" containerID="443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.059814 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316"} err="failed to get container status \"443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316\": rpc error: code = NotFound desc = could not find container \"443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316\": container with ID starting with 443c0f6fa20a4d58831bcd6b2e69d704da28dcdc484b6ba43a120bb9c2654316 not found: ID does not exist" Nov 29 04:00:27 crc kubenswrapper[4749]: I1129 04:00:27.096001 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae935673-9bf7-43a1-9750-0b524875788f" path="/var/lib/kubelet/pods/ae935673-9bf7-43a1-9750-0b524875788f/volumes" Nov 29 04:00:29 crc kubenswrapper[4749]: I1129 04:00:29.211457 4749 scope.go:117] "RemoveContainer" containerID="fc68129932f81586db52b11e911281fb3fe4ceb4e7cb7bb86578d5c97d2376cd" Nov 29 04:00:29 crc kubenswrapper[4749]: I1129 04:00:29.285466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:29 crc kubenswrapper[4749]: I1129 04:00:29.364910 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:30 crc kubenswrapper[4749]: I1129 04:00:30.678064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:30 crc kubenswrapper[4749]: I1129 04:00:30.738302 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 04:00:30 crc kubenswrapper[4749]: I1129 04:00:30.759042 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:30 crc kubenswrapper[4749]: I1129 04:00:30.973329 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gpl65" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" containerID="cri-o://af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a" gracePeriod=2 Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.550033 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.586147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content\") pod \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.586740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c48kd\" (UniqueName: \"kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd\") pod \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.586841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities\") pod \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\" (UID: \"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d\") " Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.587896 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities" (OuterVolumeSpecName: "utilities") pod "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" (UID: "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.592953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd" (OuterVolumeSpecName: "kube-api-access-c48kd") pod "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" (UID: "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d"). InnerVolumeSpecName "kube-api-access-c48kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.689389 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c48kd\" (UniqueName: \"kubernetes.io/projected/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-kube-api-access-c48kd\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.689427 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.699913 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" (UID: "d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.790963 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.986054 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerID="af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a" exitCode=0 Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.986117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerDied","Data":"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a"} Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.986146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gpl65" event={"ID":"d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d","Type":"ContainerDied","Data":"13edd743bb965a0ceb51fefb0db01ee25178cf0228ef9fdc3af20acbc1eda550"} Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.986169 4749 scope.go:117] "RemoveContainer" containerID="af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a" Nov 29 04:00:31 crc kubenswrapper[4749]: I1129 04:00:31.986372 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gpl65" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.023274 4749 scope.go:117] "RemoveContainer" containerID="e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.029355 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.038331 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gpl65"] Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.059316 4749 scope.go:117] "RemoveContainer" containerID="3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.148516 4749 scope.go:117] "RemoveContainer" containerID="af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a" Nov 29 04:00:32 crc kubenswrapper[4749]: E1129 04:00:32.148893 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a\": container with ID starting with af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a not found: ID does not exist" containerID="af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.148931 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a"} err="failed to get container status \"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a\": rpc error: code = NotFound desc = could not find container \"af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a\": container with ID starting with af00c7675e4d697aace9544e80e510f4d8a68540aa181bcfcade69dbf98ed07a not found: ID does not exist" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.148951 4749 scope.go:117] "RemoveContainer" containerID="e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477" Nov 29 04:00:32 crc kubenswrapper[4749]: E1129 04:00:32.149236 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477\": container with ID starting with e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477 not found: ID does not exist" containerID="e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.149282 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477"} err="failed to get container status \"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477\": rpc error: code = NotFound desc = could not find container \"e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477\": container with ID starting with e0d3efd275f299c1a62aca38891caa39c2186362b29b5373063c690a433ee477 not found: ID does not exist" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.149310 4749 scope.go:117] "RemoveContainer" containerID="3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870" Nov 29 04:00:32 crc kubenswrapper[4749]: E1129 04:00:32.149617 4749 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870\": container with ID starting with 3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870 not found: ID does not exist" containerID="3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870" Nov 29 04:00:32 crc kubenswrapper[4749]: I1129 04:00:32.149645 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870"} err="failed to get container status \"3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870\": rpc error: code = NotFound desc = could not find container \"3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870\": container with ID starting with 3ea1f788bcb6d537774dcafe6d6eb93bf7698a512cb0a275518b62663b003870 not found: ID does not exist" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.085251 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" path="/var/lib/kubelet/pods/d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d/volumes" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.131393 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.131612 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sfl9h" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="registry-server" containerID="cri-o://aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac" gracePeriod=2 Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.183004 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-2hprp_3605df88-08f3-4d94-bed9-34339855602a/prometheus-operator/0.log" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.448492 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f75bf8f86-jfw7f_9aae6a67-cff1-4a25-b1b9-eecf82756432/prometheus-operator-admission-webhook/0.log" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.559672 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f75bf8f86-kj2kh_f261a062-257e-44a5-8f06-05d682c51638/prometheus-operator-admission-webhook/0.log" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.714095 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-xchfm_9f2b11e6-22ff-4b26-a9d3-52241843dde7/operator/0.log" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.741740 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.762386 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-rqm5f_bc37afd3-a5b8-42ba-9749-dd7435506f70/perses-operator/0.log" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.831299 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l66tk\" (UniqueName: \"kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk\") pod \"5115c6de-1bcf-4098-b834-9411a31ff7a9\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.831416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content\") pod \"5115c6de-1bcf-4098-b834-9411a31ff7a9\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.831526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities\") pod \"5115c6de-1bcf-4098-b834-9411a31ff7a9\" (UID: \"5115c6de-1bcf-4098-b834-9411a31ff7a9\") " Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.834102 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities" (OuterVolumeSpecName: "utilities") pod "5115c6de-1bcf-4098-b834-9411a31ff7a9" (UID: "5115c6de-1bcf-4098-b834-9411a31ff7a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.841707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk" (OuterVolumeSpecName: "kube-api-access-l66tk") pod "5115c6de-1bcf-4098-b834-9411a31ff7a9" (UID: "5115c6de-1bcf-4098-b834-9411a31ff7a9"). InnerVolumeSpecName "kube-api-access-l66tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.886770 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5115c6de-1bcf-4098-b834-9411a31ff7a9" (UID: "5115c6de-1bcf-4098-b834-9411a31ff7a9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.934436 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.934472 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5115c6de-1bcf-4098-b834-9411a31ff7a9-utilities\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:33 crc kubenswrapper[4749]: I1129 04:00:33.934483 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l66tk\" (UniqueName: \"kubernetes.io/projected/5115c6de-1bcf-4098-b834-9411a31ff7a9-kube-api-access-l66tk\") on node \"crc\" DevicePath \"\"" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.009586 4749 generic.go:334] "Generic (PLEG): container finished" podID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerID="aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac" exitCode=0 Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.009702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerDied","Data":"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac"} Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.010139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sfl9h" event={"ID":"5115c6de-1bcf-4098-b834-9411a31ff7a9","Type":"ContainerDied","Data":"12aa4b19b855751615865998509870929e0fd293932ec5a2d226730940baabde"} Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.010232 4749 scope.go:117] "RemoveContainer" containerID="aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.009801 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sfl9h" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.029497 4749 scope.go:117] "RemoveContainer" containerID="9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.050382 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.062027 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sfl9h"] Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.813081 4749 scope.go:117] "RemoveContainer" containerID="17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.835845 4749 scope.go:117] "RemoveContainer" containerID="aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac" Nov 29 04:00:34 crc kubenswrapper[4749]: E1129 04:00:34.836476 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac\": container with ID starting with aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac not found: ID does not exist" containerID="aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.836536 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac"} err="failed to get container status \"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac\": rpc error: code = NotFound desc = could not find container \"aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac\": container with ID starting with aff0c0ba97ba8a190a8cf4f05285f2667b7f57908e496f163136af4d01c573ac not found: ID does not exist" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.836583 4749 scope.go:117] "RemoveContainer" containerID="9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf" Nov 29 04:00:34 crc kubenswrapper[4749]: E1129 04:00:34.836968 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf\": container with ID starting with 9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf not found: ID does not exist" containerID="9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.837012 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf"} err="failed to get container status \"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf\": rpc error: code = NotFound desc = could not find container \"9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf\": container with ID starting with 9da09ee83af80e4dd3f57b8aeb407e96dca2a23b79a32234583dbf9fcd8f6cdf not found: ID does not exist" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.837039 4749 scope.go:117] "RemoveContainer" containerID="17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079" Nov 29 04:00:34 crc kubenswrapper[4749]: E1129 04:00:34.837449 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079\": container with ID starting with 17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079 not found: ID does not exist" containerID="17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079" Nov 29 04:00:34 crc kubenswrapper[4749]: I1129 04:00:34.837480 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079"} err="failed to get container status \"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079\": rpc error: code = NotFound desc = could not find container \"17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079\": container with ID starting with 17021036d4926982070ff6fbd8eca4b475fadfb5988af8ec7e8c3825dee11079 not found: ID does not exist" Nov 29 04:00:35 crc kubenswrapper[4749]: I1129 04:00:35.091463 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" path="/var/lib/kubelet/pods/5115c6de-1bcf-4098-b834-9411a31ff7a9/volumes" Nov 29 04:00:57 crc kubenswrapper[4749]: E1129 04:00:57.447172 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:49182->38.102.83.30:35737: write tcp 38.102.83.30:49182->38.102.83.30:35737: write: broken pipe Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.167191 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29406481-9tcrb"] Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168323 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168347 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168372 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168380 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168405 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168414 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168429 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168437 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168457 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168465 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168491 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168507 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168532 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168540 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168550 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168559 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168571 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168581 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="extract-utilities" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168602 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46118562-742a-471e-bbb6-dfa7a82cc6e4" containerName="collect-profiles" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168610 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="46118562-742a-471e-bbb6-dfa7a82cc6e4" containerName="collect-profiles" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168627 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168635 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: E1129 04:01:00.168648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168656 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="extract-content" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168950 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46118562-742a-471e-bbb6-dfa7a82cc6e4" containerName="collect-profiles" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168972 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ae935673-9bf7-43a1-9750-0b524875788f" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.168993 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b13b8d-2a08-43fb-a6e7-8d91aa41fd6d" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.169008 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc28e07b-0a3c-42ac-8d98-1acc8da528ff" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.169039 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5115c6de-1bcf-4098-b834-9411a31ff7a9" containerName="registry-server" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.169944 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.176624 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406481-9tcrb"] Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.290209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.290338 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtth\" (UniqueName: \"kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.290421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.290636 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.392624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtth\" (UniqueName: \"kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.393081 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.393181 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.393377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.401912 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.402212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.412069 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.413152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gtth\" (UniqueName: \"kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth\") pod \"keystone-cron-29406481-9tcrb\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:00 crc kubenswrapper[4749]: I1129 04:01:00.494704 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:01 crc kubenswrapper[4749]: I1129 04:01:01.006663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29406481-9tcrb"] Nov 29 04:01:01 crc kubenswrapper[4749]: I1129 04:01:01.329760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406481-9tcrb" event={"ID":"e1ebac84-8841-4e49-9ce4-60f98eda4326","Type":"ContainerStarted","Data":"b16744722d3d74761c03bf17f5eb9ec6f98ff650282976031f9757041f304445"} Nov 29 04:01:01 crc kubenswrapper[4749]: I1129 04:01:01.330047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406481-9tcrb" event={"ID":"e1ebac84-8841-4e49-9ce4-60f98eda4326","Type":"ContainerStarted","Data":"fb6ad10594de6b8a173f95759c106e217c5c3ad16bcdbd02cb41cbf158a4ec3b"} Nov 29 04:01:01 crc kubenswrapper[4749]: I1129 04:01:01.348721 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29406481-9tcrb" podStartSLOduration=1.3487068629999999 podStartE2EDuration="1.348706863s" podCreationTimestamp="2025-11-29 04:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-29 04:01:01.347667868 +0000 UTC m=+10204.519817725" watchObservedRunningTime="2025-11-29 04:01:01.348706863 +0000 UTC m=+10204.520856721" Nov 29 04:01:03 crc kubenswrapper[4749]: I1129 04:01:03.350152 4749 generic.go:334] "Generic (PLEG): container finished" podID="e1ebac84-8841-4e49-9ce4-60f98eda4326" containerID="b16744722d3d74761c03bf17f5eb9ec6f98ff650282976031f9757041f304445" exitCode=0 Nov 29 04:01:03 crc kubenswrapper[4749]: I1129 04:01:03.350338 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406481-9tcrb" event={"ID":"e1ebac84-8841-4e49-9ce4-60f98eda4326","Type":"ContainerDied","Data":"b16744722d3d74761c03bf17f5eb9ec6f98ff650282976031f9757041f304445"} Nov 29 04:01:04 crc kubenswrapper[4749]: I1129 04:01:04.860687 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.014704 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle\") pod \"e1ebac84-8841-4e49-9ce4-60f98eda4326\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.015056 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gtth\" (UniqueName: \"kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth\") pod \"e1ebac84-8841-4e49-9ce4-60f98eda4326\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.015136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys\") pod \"e1ebac84-8841-4e49-9ce4-60f98eda4326\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.015183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data\") pod \"e1ebac84-8841-4e49-9ce4-60f98eda4326\" (UID: \"e1ebac84-8841-4e49-9ce4-60f98eda4326\") " Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.023336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e1ebac84-8841-4e49-9ce4-60f98eda4326" (UID: "e1ebac84-8841-4e49-9ce4-60f98eda4326"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.023888 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth" (OuterVolumeSpecName: "kube-api-access-9gtth") pod "e1ebac84-8841-4e49-9ce4-60f98eda4326" (UID: "e1ebac84-8841-4e49-9ce4-60f98eda4326"). InnerVolumeSpecName "kube-api-access-9gtth". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.073687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ebac84-8841-4e49-9ce4-60f98eda4326" (UID: "e1ebac84-8841-4e49-9ce4-60f98eda4326"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.110961 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data" (OuterVolumeSpecName: "config-data") pod "e1ebac84-8841-4e49-9ce4-60f98eda4326" (UID: "e1ebac84-8841-4e49-9ce4-60f98eda4326"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.118532 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gtth\" (UniqueName: \"kubernetes.io/projected/e1ebac84-8841-4e49-9ce4-60f98eda4326-kube-api-access-9gtth\") on node \"crc\" DevicePath \"\"" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.118579 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.118627 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-config-data\") on node \"crc\" DevicePath \"\"" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.118642 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ebac84-8841-4e49-9ce4-60f98eda4326-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.378544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29406481-9tcrb" event={"ID":"e1ebac84-8841-4e49-9ce4-60f98eda4326","Type":"ContainerDied","Data":"fb6ad10594de6b8a173f95759c106e217c5c3ad16bcdbd02cb41cbf158a4ec3b"} Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.378964 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6ad10594de6b8a173f95759c106e217c5c3ad16bcdbd02cb41cbf158a4ec3b" Nov 29 04:01:05 crc kubenswrapper[4749]: I1129 04:01:05.378572 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29406481-9tcrb" Nov 29 04:02:25 crc kubenswrapper[4749]: I1129 04:02:25.373530 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 04:02:25 crc kubenswrapper[4749]: I1129 04:02:25.374261 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 04:02:49 crc kubenswrapper[4749]: I1129 04:02:49.927980 4749 generic.go:334] "Generic (PLEG): container finished" podID="a97b28cd-23db-4aac-91de-8d4008cb0384" containerID="6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9" exitCode=0 Nov 29 04:02:49 crc kubenswrapper[4749]: I1129 04:02:49.928078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" event={"ID":"a97b28cd-23db-4aac-91de-8d4008cb0384","Type":"ContainerDied","Data":"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9"} Nov 29 04:02:49 crc kubenswrapper[4749]: I1129 04:02:49.930365 4749 scope.go:117] "RemoveContainer" containerID="6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9" Nov 29 04:02:50 crc kubenswrapper[4749]: I1129 04:02:50.918485 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-hlwsx_must-gather-bmrnf_a97b28cd-23db-4aac-91de-8d4008cb0384/gather/0.log" Nov 29 04:02:55 crc kubenswrapper[4749]: I1129 04:02:55.373809 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 04:02:55 crc kubenswrapper[4749]: I1129 04:02:55.374394 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.123386 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlwsx/must-gather-bmrnf"] Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.124142 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" podUID="a97b28cd-23db-4aac-91de-8d4008cb0384" containerName="copy" containerID="cri-o://070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847" gracePeriod=2 Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.137721 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlwsx/must-gather-bmrnf"] Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.747231 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlwsx_must-gather-bmrnf_a97b28cd-23db-4aac-91de-8d4008cb0384/copy/0.log" Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.747979 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.799000 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhkg\" (UniqueName: \"kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg\") pod \"a97b28cd-23db-4aac-91de-8d4008cb0384\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.799339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output\") pod \"a97b28cd-23db-4aac-91de-8d4008cb0384\" (UID: \"a97b28cd-23db-4aac-91de-8d4008cb0384\") " Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.805856 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg" (OuterVolumeSpecName: "kube-api-access-qdhkg") pod "a97b28cd-23db-4aac-91de-8d4008cb0384" (UID: "a97b28cd-23db-4aac-91de-8d4008cb0384"). InnerVolumeSpecName "kube-api-access-qdhkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.902333 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhkg\" (UniqueName: \"kubernetes.io/projected/a97b28cd-23db-4aac-91de-8d4008cb0384-kube-api-access-qdhkg\") on node \"crc\" DevicePath \"\"" Nov 29 04:03:00 crc kubenswrapper[4749]: I1129 04:03:00.996362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a97b28cd-23db-4aac-91de-8d4008cb0384" (UID: "a97b28cd-23db-4aac-91de-8d4008cb0384"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.004675 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a97b28cd-23db-4aac-91de-8d4008cb0384-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.064428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlwsx_must-gather-bmrnf_a97b28cd-23db-4aac-91de-8d4008cb0384/copy/0.log" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.065057 4749 generic.go:334] "Generic (PLEG): container finished" podID="a97b28cd-23db-4aac-91de-8d4008cb0384" containerID="070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847" exitCode=143 Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.065139 4749 scope.go:117] "RemoveContainer" containerID="070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.065276 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlwsx/must-gather-bmrnf" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.089470 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97b28cd-23db-4aac-91de-8d4008cb0384" path="/var/lib/kubelet/pods/a97b28cd-23db-4aac-91de-8d4008cb0384/volumes" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.089645 4749 scope.go:117] "RemoveContainer" containerID="6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.148466 4749 scope.go:117] "RemoveContainer" containerID="070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847" Nov 29 04:03:01 crc kubenswrapper[4749]: E1129 04:03:01.148973 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847\": container with ID starting with 070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847 not found: ID does not exist" containerID="070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.149019 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847"} err="failed to get container status \"070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847\": rpc error: code = NotFound desc = could not find container \"070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847\": container with ID starting with 070baa05f1d7d15d773a722f84421625455ea393618a51a63252d2c3fb80d847 not found: ID does not exist" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.149046 4749 scope.go:117] "RemoveContainer" containerID="6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9" Nov 29 04:03:01 crc kubenswrapper[4749]: E1129 04:03:01.149318 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9\": container with ID starting with 6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9 not found: ID does not exist" containerID="6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9" Nov 29 04:03:01 crc kubenswrapper[4749]: I1129 04:03:01.149360 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9"} err="failed to get container status \"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9\": rpc error: code = NotFound desc = could not find container \"6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9\": container with ID starting with 6262e83f237415213c40cfad18de3232d5580c806bcc732ed0ead5359fbd3ef9 not found: ID does not exist" Nov 29 04:03:25 crc kubenswrapper[4749]: I1129 04:03:25.373724 4749 patch_prober.go:28] interesting pod/machine-config-daemon-mnsct container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 29 04:03:25 crc kubenswrapper[4749]: I1129 04:03:25.374275 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" 
podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 29 04:03:25 crc kubenswrapper[4749]: I1129 04:03:25.374333 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" Nov 29 04:03:25 crc kubenswrapper[4749]: I1129 04:03:25.375095 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"577fb5ad0e14eb75e16df5dc241a22829818fdfa2374e8c0dcd5dc8df5c702c1"} pod="openshift-machine-config-operator/machine-config-daemon-mnsct" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 29 04:03:25 crc kubenswrapper[4749]: I1129 04:03:25.375153 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" podUID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerName="machine-config-daemon" containerID="cri-o://577fb5ad0e14eb75e16df5dc241a22829818fdfa2374e8c0dcd5dc8df5c702c1" gracePeriod=600 Nov 29 04:03:26 crc kubenswrapper[4749]: I1129 04:03:26.359569 4749 generic.go:334] "Generic (PLEG): container finished" podID="800b3936-ba93-47d8-9417-2fdc5ce4d171" containerID="577fb5ad0e14eb75e16df5dc241a22829818fdfa2374e8c0dcd5dc8df5c702c1" exitCode=0 Nov 29 04:03:26 crc kubenswrapper[4749]: I1129 04:03:26.359599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerDied","Data":"577fb5ad0e14eb75e16df5dc241a22829818fdfa2374e8c0dcd5dc8df5c702c1"} Nov 29 04:03:26 crc kubenswrapper[4749]: I1129 04:03:26.360132 4749 scope.go:117] "RemoveContainer" containerID="feb39ce59afee02ed32a72ecc038decada4004fe6200bfe3bca921a1fa3ab24f" Nov 29 04:03:27 crc kubenswrapper[4749]: I1129 04:03:27.373913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mnsct" event={"ID":"800b3936-ba93-47d8-9417-2fdc5ce4d171","Type":"ContainerStarted","Data":"684cf20c97fd862a535aa953835e18f65019aee9fac49eb94c64168e344c952b"}